Jan 27 15:06:52 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 27 15:06:52 crc restorecon[4741]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:52 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 15:06:53 crc restorecon[4741]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 
15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 
crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 
15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 
crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc 
restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 15:06:53 crc restorecon[4741]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 27 15:06:54 crc kubenswrapper[4772]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.430419 4772 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.434993 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435018 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435024 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435032 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435038 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435044 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435049 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435055 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435061 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435066 4772 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435071 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435078 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435083 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435089 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435094 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435099 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435104 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435110 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435115 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435120 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435125 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435130 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435135 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435141 4772 feature_gate.go:330] unrecognized feature gate: 
MixedCPUsAllocation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435146 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435151 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435156 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435161 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435183 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435189 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435194 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435199 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435204 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435212 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435220 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435226 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435233 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435239 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435244 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435250 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435257 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435262 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435268 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435273 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435278 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435285 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435292 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435300 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435307 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435314 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435321 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435327 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435334 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435341 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435349 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435358 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435365 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435377 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435383 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435388 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435394 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435401 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435408 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435413 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435419 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435425 4772 feature_gate.go:330] unrecognized feature gate: Example Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435430 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435437 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435442 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435447 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.435452 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435591 4772 flags.go:64] FLAG: --address="0.0.0.0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435604 4772 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435614 4772 flags.go:64] FLAG: --anonymous-auth="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435622 4772 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435630 4772 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435636 4772 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 
15:06:54.435644 4772 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435652 4772 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435659 4772 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435665 4772 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435672 4772 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435682 4772 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435688 4772 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435694 4772 flags.go:64] FLAG: --cgroup-root="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435700 4772 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435707 4772 flags.go:64] FLAG: --client-ca-file="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435714 4772 flags.go:64] FLAG: --cloud-config="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435720 4772 flags.go:64] FLAG: --cloud-provider="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435726 4772 flags.go:64] FLAG: --cluster-dns="[]" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435734 4772 flags.go:64] FLAG: --cluster-domain="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435740 4772 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435746 4772 flags.go:64] FLAG: --config-dir="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435752 4772 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 27 15:06:54 
crc kubenswrapper[4772]: I0127 15:06:54.435786 4772 flags.go:64] FLAG: --container-log-max-files="5" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435796 4772 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435803 4772 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435809 4772 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435815 4772 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435821 4772 flags.go:64] FLAG: --contention-profiling="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435827 4772 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435833 4772 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435840 4772 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435847 4772 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435866 4772 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435872 4772 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435878 4772 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435884 4772 flags.go:64] FLAG: --enable-load-reader="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435890 4772 flags.go:64] FLAG: --enable-server="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435896 4772 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435904 4772 flags.go:64] 
FLAG: --event-burst="100" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435911 4772 flags.go:64] FLAG: --event-qps="50" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435917 4772 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435924 4772 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435930 4772 flags.go:64] FLAG: --eviction-hard="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435937 4772 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435943 4772 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435949 4772 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435955 4772 flags.go:64] FLAG: --eviction-soft="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435961 4772 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435967 4772 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435973 4772 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435979 4772 flags.go:64] FLAG: --experimental-mounter-path="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435986 4772 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435992 4772 flags.go:64] FLAG: --fail-swap-on="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.435998 4772 flags.go:64] FLAG: --feature-gates="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436008 4772 flags.go:64] FLAG: --file-check-frequency="20s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436015 4772 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436021 4772 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436027 4772 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436034 4772 flags.go:64] FLAG: --healthz-port="10248" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436040 4772 flags.go:64] FLAG: --help="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436046 4772 flags.go:64] FLAG: --hostname-override="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436052 4772 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436058 4772 flags.go:64] FLAG: --http-check-frequency="20s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436064 4772 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436070 4772 flags.go:64] FLAG: --image-credential-provider-config="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436076 4772 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436082 4772 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436089 4772 flags.go:64] FLAG: --image-service-endpoint="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436095 4772 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436102 4772 flags.go:64] FLAG: --kube-api-burst="100" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436108 4772 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436114 4772 flags.go:64] FLAG: --kube-api-qps="50" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436121 4772 
flags.go:64] FLAG: --kube-reserved="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436127 4772 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436132 4772 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436139 4772 flags.go:64] FLAG: --kubelet-cgroups="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436144 4772 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436151 4772 flags.go:64] FLAG: --lock-file="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436156 4772 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436163 4772 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436188 4772 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436197 4772 flags.go:64] FLAG: --log-json-split-stream="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436204 4772 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436210 4772 flags.go:64] FLAG: --log-text-split-stream="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436216 4772 flags.go:64] FLAG: --logging-format="text" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436222 4772 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436229 4772 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436235 4772 flags.go:64] FLAG: --manifest-url="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436241 4772 flags.go:64] FLAG: --manifest-url-header="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436249 4772 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436255 4772 flags.go:64] FLAG: --max-open-files="1000000" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436263 4772 flags.go:64] FLAG: --max-pods="110" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436269 4772 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436275 4772 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436281 4772 flags.go:64] FLAG: --memory-manager-policy="None" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436287 4772 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436293 4772 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436299 4772 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436305 4772 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436319 4772 flags.go:64] FLAG: --node-status-max-images="50" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436327 4772 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436333 4772 flags.go:64] FLAG: --oom-score-adj="-999" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436339 4772 flags.go:64] FLAG: --pod-cidr="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436347 4772 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436355 4772 flags.go:64] FLAG: 
--pod-manifest-path="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436361 4772 flags.go:64] FLAG: --pod-max-pids="-1" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436367 4772 flags.go:64] FLAG: --pods-per-core="0" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436374 4772 flags.go:64] FLAG: --port="10250" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436380 4772 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436386 4772 flags.go:64] FLAG: --provider-id="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436392 4772 flags.go:64] FLAG: --qos-reserved="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436398 4772 flags.go:64] FLAG: --read-only-port="10255" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436406 4772 flags.go:64] FLAG: --register-node="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436414 4772 flags.go:64] FLAG: --register-schedulable="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436422 4772 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436435 4772 flags.go:64] FLAG: --registry-burst="10" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436474 4772 flags.go:64] FLAG: --registry-qps="5" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436482 4772 flags.go:64] FLAG: --reserved-cpus="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436490 4772 flags.go:64] FLAG: --reserved-memory="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436500 4772 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436508 4772 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436515 4772 flags.go:64] FLAG: --rotate-certificates="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 
15:06:54.436521 4772 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436527 4772 flags.go:64] FLAG: --runonce="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436533 4772 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436539 4772 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436546 4772 flags.go:64] FLAG: --seccomp-default="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436552 4772 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436558 4772 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436564 4772 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436571 4772 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436577 4772 flags.go:64] FLAG: --storage-driver-password="root" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436583 4772 flags.go:64] FLAG: --storage-driver-secure="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436590 4772 flags.go:64] FLAG: --storage-driver-table="stats" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436596 4772 flags.go:64] FLAG: --storage-driver-user="root" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436602 4772 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436608 4772 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436614 4772 flags.go:64] FLAG: --system-cgroups="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436620 4772 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436632 4772 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436638 4772 flags.go:64] FLAG: --tls-cert-file="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436644 4772 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436652 4772 flags.go:64] FLAG: --tls-min-version="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436658 4772 flags.go:64] FLAG: --tls-private-key-file="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436671 4772 flags.go:64] FLAG: --topology-manager-policy="none" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436678 4772 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436684 4772 flags.go:64] FLAG: --topology-manager-scope="container" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436690 4772 flags.go:64] FLAG: --v="2" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436698 4772 flags.go:64] FLAG: --version="false" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436706 4772 flags.go:64] FLAG: --vmodule="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436713 4772 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.436720 4772 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436856 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436863 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436869 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 15:06:54 crc 
kubenswrapper[4772]: W0127 15:06:54.436874 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436879 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436885 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436890 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436895 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436901 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436906 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436913 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436919 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436925 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436930 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436936 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436941 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436946 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436951 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436957 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436962 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436967 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436972 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436978 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436983 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436991 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.436996 4772 feature_gate.go:330] unrecognized feature 
gate: OnClusterBuild Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437002 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437007 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437014 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437020 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437026 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437031 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437037 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437042 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437048 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437053 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437058 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437063 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437069 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437076 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437082 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437089 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437096 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437102 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437109 4772 feature_gate.go:330] unrecognized feature gate: Example Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437115 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437121 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437127 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437134 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437141 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437148 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437155 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437161 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437192 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 
15:06:54.437200 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437206 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437216 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437223 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437229 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437240 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437249 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437256 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437263 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437270 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437277 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437285 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437291 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437297 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437304 4772 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437313 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.437321 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.437341 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.446432 4772 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.446477 4772 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446585 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446602 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446608 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446614 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446622 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446630 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446636 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446642 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446647 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446653 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446659 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446665 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446671 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446676 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446683 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446688 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446694 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446699 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446705 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 
15:06:54.446710 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446715 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446720 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446724 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446729 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446734 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446740 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446745 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446750 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446754 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446759 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446764 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446769 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446774 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446780 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446786 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446791 4772 feature_gate.go:330] unrecognized feature gate: Example Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446796 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446800 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446805 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446810 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446815 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446820 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446825 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446830 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446835 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446839 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446844 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446849 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 
15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446854 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446860 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446868 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446874 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446879 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446884 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446889 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446894 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446899 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446904 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446908 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446913 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446918 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446923 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 
15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446928 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446933 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446938 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446944 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446950 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446956 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446962 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446968 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.446974 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.446984 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447156 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447183 4772 
feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447189 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447195 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447200 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447206 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447211 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447216 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447221 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447226 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447232 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447238 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447243 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447247 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447253 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447258 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 
15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447263 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447268 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447273 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447278 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447283 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447288 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447292 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447297 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447302 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447307 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447312 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447317 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447321 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447326 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447331 4772 feature_gate.go:330] unrecognized 
feature gate: AdditionalRoutingCapabilities Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447336 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447341 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447347 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447352 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447357 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447362 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447366 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447373 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447379 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447385 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447390 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447397 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447403 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447410 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447415 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447420 4772 feature_gate.go:330] unrecognized feature gate: Example Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447425 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447429 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447435 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447440 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447446 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447451 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447457 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447463 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447469 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447474 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447479 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447485 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447491 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447497 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447502 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447507 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447513 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447518 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447523 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447528 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447533 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 15:06:54 crc 
kubenswrapper[4772]: W0127 15:06:54.447538 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447543 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.447548 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.447555 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.447739 4772 server.go:940] "Client rotation is on, will bootstrap in background" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.454537 4772 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.454643 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.456730 4772 server.go:997] "Starting client certificate rotation" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.456764 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.458711 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-06 18:17:37.914092538 +0000 UTC Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.458929 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.490416 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.491952 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.494057 4772 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.512803 4772 log.go:25] "Validated CRI v1 runtime API" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.547156 4772 log.go:25] "Validated CRI v1 image API" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.548915 4772 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.554033 4772 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-27-15-01-40-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.554089 4772 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.575445 4772 manager.go:217] Machine: {Timestamp:2026-01-27 15:06:54.573479047 +0000 UTC m=+0.554088165 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3933c4f3-43c9-48b4-998d-ee6c7e3cb9de BootID:3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 
Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ef:3f:1d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ef:3f:1d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d9:12:63 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:88:df:b0 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:b0:26:c3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9d:1f:a3 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:99:7f:77 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:7b:72:df:55:0a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:cc:75:46:58:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.575747 
4772 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.575910 4772 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.577891 4772 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.578066 4772 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.578106 4772 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"image
fs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.578306 4772 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.578318 4772 container_manager_linux.go:303] "Creating device plugin manager" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.579282 4772 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.579315 4772 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.579520 4772 state_mem.go:36] "Initialized new in-memory state store" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.579611 4772 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.584005 4772 kubelet.go:418] "Attempting to sync node with API server" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.584026 4772 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.584046 4772 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.584059 4772 kubelet.go:324] "Adding apiserver pod source" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.584070 4772 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.588119 4772 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.589064 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.590959 4772 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.592359 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.592462 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.592439 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.592518 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592906 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592935 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592945 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592954 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592967 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592974 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.592981 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.593004 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.593019 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.593026 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.593038 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.593472 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.595875 4772 plugins.go:603] 
"Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.596376 4772 server.go:1280] "Started kubelet" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.596526 4772 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.596990 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:54 crc systemd[1]: Started Kubernetes Kubelet. Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.599572 4772 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.600652 4772 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.602114 4772 server.go:460] "Adding debug handlers to kubelet server" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.603485 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.603776 4772 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.604090 4772 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.604197 4772 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.603908 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 16:01:57.657047466 +0000 UTC Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.604072 4772 
event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.134:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e9ee612af890f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:06:54.596344079 +0000 UTC m=+0.576953177,LastTimestamp:2026-01-27 15:06:54.596344079 +0000 UTC m=+0.576953177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.609846 4772 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.610251 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.610307 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.604145 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610526 4772 factory.go:55] Registering systemd factory Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 
15:06:54.610549 4772 factory.go:221] Registration of the systemd container factory successfully Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610773 4772 factory.go:153] Registering CRI-O factory Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610787 4772 factory.go:221] Registration of the crio container factory successfully Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610903 4772 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610929 4772 factory.go:103] Registering Raw factory Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.610944 4772 manager.go:1196] Started watching for new ooms in manager Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.610966 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="200ms" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.611947 4772 manager.go:319] Starting recovery of all containers Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621719 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621819 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 27 15:06:54 crc 
kubenswrapper[4772]: I0127 15:06:54.621834 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621847 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621865 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621882 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621897 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621912 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621925 4772 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621937 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.621975 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622007 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622031 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622046 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622068 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622080 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622114 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622137 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622162 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622256 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622288 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622306 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622353 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622384 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622402 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622455 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.622504 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.625437 4772 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.625633 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.625747 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.625860 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.625962 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626064 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626152 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626303 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626404 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626487 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626569 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626647 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626726 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626804 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.626990 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627098 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627222 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627314 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627419 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627532 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627613 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627734 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627872 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.627963 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628051 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628130 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628244 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628336 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628417 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628497 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628605 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628692 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628774 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.628888 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629017 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629146 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629352 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629450 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629544 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629663 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629806 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629906 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.629984 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630067 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630221 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630345 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630438 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630520 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630626 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630769 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630562 4772 manager.go:324] Recovery completed Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.630891 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631023 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631044 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631057 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631069 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631081 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631094 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631105 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631117 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631131 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631143 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631153 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631179 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631194 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631205 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631216 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631227 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631239 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631252 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631264 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631278 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631289 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631301 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631314 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631327 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631413 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631451 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631475 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631497 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.631980 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632022 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632040 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632054 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632069 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632083 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632096 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632108 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632122 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632136 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632149 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" 
Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632178 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632194 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632207 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632219 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632231 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632243 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 
15:06:54.632255 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632267 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632279 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632291 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632304 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632317 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632332 4772 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632345 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632358 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632370 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632382 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632396 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632408 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632421 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632434 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632449 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632460 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632472 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632486 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632498 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632511 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632549 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632562 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632576 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632590 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 27 
15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632606 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632618 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632633 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632647 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632660 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632673 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 
15:06:54.632686 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632699 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632715 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632730 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632743 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632756 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632768 4772 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632781 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632795 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632807 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632823 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632837 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632851 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632865 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632877 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632889 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632902 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632915 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632928 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632942 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632954 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632969 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632982 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.632995 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633008 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633021 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633038 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633050 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633063 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633075 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633087 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633098 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633110 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633122 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633134 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633146 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633160 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633189 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633202 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633214 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633227 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633239 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633250 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633261 4772 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633275 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633287 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633299 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633311 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633329 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633342 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633354 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633365 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633377 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633390 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633403 4772 reconstruct.go:97] "Volume reconstruction finished" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.633412 4772 reconciler.go:26] "Reconciler: start to sync state" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.647626 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.652039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.652083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.652095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.652979 4772 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.653002 4772 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.653069 4772 state_mem.go:36] "Initialized new in-memory state store" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.659380 4772 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.661651 4772 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.661687 4772 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.661716 4772 kubelet.go:2335] "Starting kubelet main sync loop" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.661843 4772 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 15:06:54 crc kubenswrapper[4772]: W0127 15:06:54.664463 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.664554 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.667827 4772 policy_none.go:49] "None policy: Start" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.669080 4772 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.669122 4772 state_mem.go:35] "Initializing new in-memory state store" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.711398 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.721388 4772 manager.go:334] "Starting Device Plugin manager" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.721634 4772 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.721657 4772 server.go:79] "Starting device plugin registration server" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.722204 4772 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.722225 4772 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.722436 4772 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.722532 4772 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.722551 4772 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.729555 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.762620 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.762696 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.763713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.763745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.763757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.763917 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764180 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764226 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764598 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764758 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.764776 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765180 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765197 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765339 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: 
I0127 15:06:54.765511 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765534 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.765960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766025 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766102 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766364 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766383 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766774 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.766803 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.767413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.811695 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="400ms" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.822318 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.823470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.823513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.823528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.823585 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.824099 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.134:6443: connect: connection refused" node="crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.835956 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.835997 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836043 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc 
kubenswrapper[4772]: I0127 15:06:54.836106 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836148 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836197 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836249 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836284 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836312 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836343 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836370 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.836396 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937841 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937885 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937932 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938061 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938089 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938147 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938163 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.937955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938251 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938152 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938350 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938414 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938424 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938482 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938503 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938547 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938579 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938619 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938676 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: I0127 15:06:54.938832 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 15:06:54 crc kubenswrapper[4772]: E0127 15:06:54.951198 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.134:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e9ee612af890f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:06:54.596344079 +0000 UTC m=+0.576953177,LastTimestamp:2026-01-27 15:06:54.596344079 +0000 UTC m=+0.576953177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.025068 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.026548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.026584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.026596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.026623 
4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.027055 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.134:6443: connect: connection refused" node="crc" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.100395 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.116439 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.123796 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.130364 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.131923 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.156253 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-b7446aa0404e33f408195fbb0a5d7000258cc2dcccf533b87b03def949c8451a WatchSource:0}: Error finding container b7446aa0404e33f408195fbb0a5d7000258cc2dcccf533b87b03def949c8451a: Status 404 returned error can't find the container with id b7446aa0404e33f408195fbb0a5d7000258cc2dcccf533b87b03def949c8451a Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.157853 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-07214d993250107ab3c355de4cbedb841547f07ba78eb819ae7a8cf2f85d3254 WatchSource:0}: Error finding container 07214d993250107ab3c355de4cbedb841547f07ba78eb819ae7a8cf2f85d3254: Status 404 returned error can't find the container with id 07214d993250107ab3c355de4cbedb841547f07ba78eb819ae7a8cf2f85d3254 Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.162636 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-fe47d7b7a8f551a26eb5563986df0792e2e7780fe682fc344ae48aad69d582e6 WatchSource:0}: Error finding container fe47d7b7a8f551a26eb5563986df0792e2e7780fe682fc344ae48aad69d582e6: Status 404 returned error can't find the container with id fe47d7b7a8f551a26eb5563986df0792e2e7780fe682fc344ae48aad69d582e6 Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.163555 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b64d7a630d80b6360b1f1a17d498dd9dfef664e1e946d3bb2d0127cea3dc9104 
WatchSource:0}: Error finding container b64d7a630d80b6360b1f1a17d498dd9dfef664e1e946d3bb2d0127cea3dc9104: Status 404 returned error can't find the container with id b64d7a630d80b6360b1f1a17d498dd9dfef664e1e946d3bb2d0127cea3dc9104 Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.165918 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-86d5acd4da3e8815a6f066738d8ae875fe76d45ca3c619c2319b08ea2150cd45 WatchSource:0}: Error finding container 86d5acd4da3e8815a6f066738d8ae875fe76d45ca3c619c2319b08ea2150cd45: Status 404 returned error can't find the container with id 86d5acd4da3e8815a6f066738d8ae875fe76d45ca3c619c2319b08ea2150cd45 Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.212529 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="800ms" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.427755 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.428922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.428959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.428971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.429002 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.429597 4772 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.134:6443: connect: connection refused" node="crc" Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.557865 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.557958 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.598001 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.610133 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 03:10:21.574310172 +0000 UTC Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.635797 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.635892 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.670266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"86d5acd4da3e8815a6f066738d8ae875fe76d45ca3c619c2319b08ea2150cd45"} Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.671471 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fe47d7b7a8f551a26eb5563986df0792e2e7780fe682fc344ae48aad69d582e6"} Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.672209 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07214d993250107ab3c355de4cbedb841547f07ba78eb819ae7a8cf2f85d3254"} Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.673077 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b7446aa0404e33f408195fbb0a5d7000258cc2dcccf533b87b03def949c8451a"} Jan 27 15:06:55 crc kubenswrapper[4772]: I0127 15:06:55.673758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b64d7a630d80b6360b1f1a17d498dd9dfef664e1e946d3bb2d0127cea3dc9104"} Jan 27 15:06:55 crc kubenswrapper[4772]: W0127 15:06:55.852694 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:55 crc kubenswrapper[4772]: E0127 15:06:55.852763 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:56 crc kubenswrapper[4772]: E0127 15:06:56.014754 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="1.6s" Jan 27 15:06:56 crc kubenswrapper[4772]: W0127 15:06:56.183768 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:56 crc kubenswrapper[4772]: E0127 15:06:56.183858 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.229714 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.231866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 
15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.231907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.231918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.231943 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:06:56 crc kubenswrapper[4772]: E0127 15:06:56.233104 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.134:6443: connect: connection refused" node="crc" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.599030 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.611249 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.611269 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:14:09.226128005 +0000 UTC Jan 27 15:06:56 crc kubenswrapper[4772]: E0127 15:06:56.612549 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.678293 4772 generic.go:334] "Generic 
(PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20" exitCode=0 Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.678389 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.678413 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.679765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.679828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.679846 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.680912 4772 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8a8432f1dcc97ca9d30249542dfebc79f098859c0f7e04a637be764939fb6072" exitCode=0 Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.681021 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8a8432f1dcc97ca9d30249542dfebc79f098859c0f7e04a637be764939fb6072"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.681180 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.682236 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.682277 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.682292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.684229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.684268 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.684269 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.684386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.684399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 
15:06:56.685286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.685321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.685333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.688404 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5" exitCode=0 Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.688449 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.688637 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.689806 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9" exitCode=0 Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.689841 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9"} Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.689956 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.690064 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.690091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.690105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.691011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.691032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.691043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.691765 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.692613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.692645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:56 crc kubenswrapper[4772]: I0127 15:06:56.692688 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.023302 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.064507 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:57 crc kubenswrapper[4772]: W0127 15:06:57.347344 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:57 crc kubenswrapper[4772]: E0127 15:06:57.347431 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.482258 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.598434 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.611496 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 00:54:04.935516693 +0000 UTC Jan 27 15:06:57 crc kubenswrapper[4772]: E0127 15:06:57.615220 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="3.2s" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.694980 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"109762c1c77a786a3c953ead1b65a17e9401f8ba205858ffc79f6e188d5005df"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.695094 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.696473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.696511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.696525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700804 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700902 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.700906 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.702469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.702522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.702540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.704934 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75" exitCode=0 Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.705037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.705053 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.706569 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.706602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.706616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.709681 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.709738 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.709756 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7"} Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.709757 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.709817 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.711390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.711427 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.711442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.712262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.712295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.712312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.735788 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.833819 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.834844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.834876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.834885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:57 crc kubenswrapper[4772]: I0127 15:06:57.834904 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:06:57 crc kubenswrapper[4772]: E0127 15:06:57.835282 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.134:6443: 
connect: connection refused" node="crc" Jan 27 15:06:58 crc kubenswrapper[4772]: W0127 15:06:58.059626 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:58 crc kubenswrapper[4772]: E0127 15:06:58.059713 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.263344 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.525256 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.525341 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.525372 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.598626 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.612446 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:53:02.965827067 +0000 UTC Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.716395 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.718558 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded" exitCode=255 Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.718642 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded"} Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.718654 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.719596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.719626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.719637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.720214 4772 scope.go:117] "RemoveContainer" 
containerID="21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.721068 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970" exitCode=0 Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.721154 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.721646 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.721980 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970"} Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.722020 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.722064 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.722067 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 
15:06:58.723078 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723802 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:58 crc kubenswrapper[4772]: I0127 15:06:58.723813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:58 crc kubenswrapper[4772]: W0127 15:06:58.741557 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:58 crc kubenswrapper[4772]: E0127 15:06:58.741632 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:58 crc kubenswrapper[4772]: W0127 15:06:58.831317 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.134:6443: connect: connection refused Jan 27 15:06:58 crc kubenswrapper[4772]: E0127 15:06:58.831428 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.134:6443: connect: connection refused" logger="UnhandledError" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.612710 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 17:11:10.198188402 +0000 UTC Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.727980 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.730551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67"} Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.730598 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:59 crc kubenswrapper[4772]: 
I0127 15:06:59.732362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.732413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.732435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.736015 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1"} Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.736060 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c"} Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.736079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10"} Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.736088 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.736090 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326"} Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.737202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.737231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:06:59 crc kubenswrapper[4772]: I0127 15:06:59.737240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.613629 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 03:03:39.705395721 +0000 UTC Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.743488 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5"} Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.743566 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.743601 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.743637 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745550 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.745572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:00 crc kubenswrapper[4772]: I0127 15:07:00.931669 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.035445 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.036922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.036953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.036962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.037006 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.287129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.615387 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 14:19:20.540339678 +0000 UTC Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.745624 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.745624 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:01 crc kubenswrapper[4772]: I0127 15:07:01.746963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.616135 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 21:17:13.608597062 +0000 UTC Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.672573 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.747786 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.747893 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:02 crc kubenswrapper[4772]: 
I0127 15:07:02.748967 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.749085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.749161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.749340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.749399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:02 crc kubenswrapper[4772]: I0127 15:07:02.749414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.524906 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.525049 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.525088 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.526332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.526395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.526421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 27 15:07:03 crc kubenswrapper[4772]: I0127 15:07:03.616722 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 16:11:23.891749624 +0000 UTC Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.542203 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.542596 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.544638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.544677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.544687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.617272 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 08:37:21.852171994 +0000 UTC Jan 27 15:07:04 crc kubenswrapper[4772]: E0127 15:07:04.729714 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.730858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.731006 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.731989 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.732033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:04 crc kubenswrapper[4772]: I0127 15:07:04.732048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:05 crc kubenswrapper[4772]: I0127 15:07:05.617515 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 04:48:44.110438372 +0000 UTC Jan 27 15:07:06 crc kubenswrapper[4772]: I0127 15:07:06.525668 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 15:07:06 crc kubenswrapper[4772]: I0127 15:07:06.525790 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 15:07:06 crc kubenswrapper[4772]: I0127 15:07:06.617963 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 09:01:26.846257702 +0000 UTC Jan 27 15:07:07 crc kubenswrapper[4772]: I0127 15:07:07.618722 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 12:03:43.652096101 +0000 UTC Jan 27 
15:07:08 crc kubenswrapper[4772]: I0127 15:07:08.263914 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 27 15:07:08 crc kubenswrapper[4772]: I0127 15:07:08.263994 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 27 15:07:08 crc kubenswrapper[4772]: I0127 15:07:08.619760 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 11:19:31.526392386 +0000 UTC Jan 27 15:07:09 crc kubenswrapper[4772]: I0127 15:07:09.459928 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 27 15:07:09 crc kubenswrapper[4772]: I0127 15:07:09.459982 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 15:07:09 crc kubenswrapper[4772]: I0127 15:07:09.467624 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP 
probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 27 15:07:09 crc kubenswrapper[4772]: I0127 15:07:09.467695 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 27 15:07:09 crc kubenswrapper[4772]: I0127 15:07:09.620649 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 20:41:08.188648925 +0000 UTC Jan 27 15:07:10 crc kubenswrapper[4772]: I0127 15:07:10.621367 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 14:04:22.013766922 +0000 UTC Jan 27 15:07:11 crc kubenswrapper[4772]: I0127 15:07:11.622246 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 18:50:53.989294917 +0000 UTC Jan 27 15:07:12 crc kubenswrapper[4772]: I0127 15:07:12.623089 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 03:51:42.238454132 +0000 UTC Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.531653 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.531810 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.533511 4772 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.533559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.533573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.539030 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.623903 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 08:13:33.931890479 +0000 UTC Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.779155 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.780812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.780841 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:13 crc kubenswrapper[4772]: I0127 15:07:13.780850 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.440390 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.442696 4772 trace.go:236] Trace[736418201]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 
(27-Jan-2026 15:07:02.632) (total time: 11810ms): Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[736418201]: ---"Objects listed" error: 11810ms (15:07:14.442) Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[736418201]: [11.810231167s] [11.810231167s] END Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.442723 4772 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.443893 4772 trace.go:236] Trace[1971219875]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:07:03.607) (total time: 10836ms): Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[1971219875]: ---"Objects listed" error: 10836ms (15:07:14.443) Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[1971219875]: [10.836082889s] [10.836082889s] END Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.443914 4772 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.444382 4772 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.444458 4772 trace.go:236] Trace[261586842]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:07:04.320) (total time: 10124ms): Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[261586842]: ---"Objects listed" error: 10124ms (15:07:14.444) Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[261586842]: [10.124373696s] [10.124373696s] END Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.444472 4772 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.445257 4772 trace.go:236] Trace[1285288433]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:07:02.744) (total time: 11700ms): Jan 27 15:07:14 crc 
kubenswrapper[4772]: Trace[1285288433]: ---"Objects listed" error: 11700ms (15:07:14.445) Jan 27 15:07:14 crc kubenswrapper[4772]: Trace[1285288433]: [11.70081603s] [11.70081603s] END Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.445277 4772 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.446500 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.466993 4772 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.482011 4772 csr.go:261] certificate signing request csr-2bhmf is approved, waiting to be issued Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.491646 4772 csr.go:257] certificate signing request csr-2bhmf is issued Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.495738 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.500489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.501113 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.568231 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.579254 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-etcd/etcd-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.624843 4772 apiserver.go:52] "Watching apiserver" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.624852 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:28:14.708608036 +0000 UTC Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628136 4772 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628498 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628799 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628907 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628929 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.628972 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.629097 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.629197 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.629207 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.629271 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.629478 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.632298 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.632447 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.633384 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.634114 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.634243 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.635638 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.635995 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.636054 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.640022 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.650218 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.666513 4772 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.678118 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.693462 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.699999 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44474->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.700074 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" 
output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:44474->192.168.126.11:17697: read: connection reset by peer" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.700362 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.700389 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.709090 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.710646 4772 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.728901 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.744914 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745036 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745072 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod 
\"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745087 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745117 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745133 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745150 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745189 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745237 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745254 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745285 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745340 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745357 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745404 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745421 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745443 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745476 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745491 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745510 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745528 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745545 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745586 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745556 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745611 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745632 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745651 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745667 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745716 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745732 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745840 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745929 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745945 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745961 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745977 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745996 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746028 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746042 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746058 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746076 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746123 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746141 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746160 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746206 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746230 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746254 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746271 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746305 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746320 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746338 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746386 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746407 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746448 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746464 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746495 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746510 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746563 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746592 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:07:14 crc 
kubenswrapper[4772]: I0127 15:07:14.746608 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746623 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746643 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746674 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746690 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746704 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746718 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746734 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746750 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 
15:07:14.746779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746797 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746812 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746828 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746844 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746861 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746879 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746895 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746911 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746941 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 
crc kubenswrapper[4772]: I0127 15:07:14.746958 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746974 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746990 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747005 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747022 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745556 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" 
(OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745726 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745674 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745766 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.745953 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746021 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746140 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746227 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747194 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746225 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746320 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746443 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747223 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746497 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746644 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746751 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746801 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746964 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746976 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.746966 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747293 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747341 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747404 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747408 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747454 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747459 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747475 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747499 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747517 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747532 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747547 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747563 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747563 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747595 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747611 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747627 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747712 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747738 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747762 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod 
\"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747789 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747811 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747859 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747882 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747903 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747924 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747946 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747966 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.747985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748005 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748028 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748048 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748069 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748091 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748095 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748117 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748141 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748184 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748233 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748256 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748278 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748299 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748321 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748384 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748453 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748478 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748502 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748525 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748552 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748574 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748596 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748622 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:07:14 crc 
kubenswrapper[4772]: I0127 15:07:14.748642 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748686 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748693 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748709 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748734 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748781 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748802 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748823 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748846 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748870 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.748893 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.749373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.749672 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.749704 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.749982 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750041 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750052 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750536 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750872 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750907 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.750934 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.751157 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.751582 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.751591 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.751858 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.752083 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.752570 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.752647 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.752813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753174 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753539 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753656 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753740 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753804 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.754302 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.754647 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.754892 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.754943 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755152 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755524 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755673 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755713 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755996 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756156 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756249 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.753958 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756398 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756710 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756763 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.756749 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757101 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757338 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757299 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.755850 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757702 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757826 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757828 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757844 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757839 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757957 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.757977 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758002 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758136 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758383 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758671 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758708 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.760708 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.760734 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.760824 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.761726 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762611 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762627 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762644 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762660 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762753 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.762782 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764287 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.763160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764338 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764406 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764720 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764760 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.758024 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764920 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764959 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764988 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765015 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765042 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765066 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765093 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765120 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765145 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765191 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:07:14 crc 
kubenswrapper[4772]: I0127 15:07:14.765216 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765241 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765264 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765289 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765313 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765335 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765445 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765472 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765497 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765522 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765545 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765568 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765592 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765615 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765637 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765825 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765853 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 
15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765876 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765926 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765953 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765977 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766070 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766154 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766188 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766201 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766214 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766227 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766239 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766259 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766272 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766285 4772 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766298 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766311 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766323 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766335 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766348 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766361 4772 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766374 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766386 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766398 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766410 4772 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766423 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766435 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766447 4772 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766460 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766474 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766487 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766499 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766511 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766523 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766534 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766547 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766559 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766571 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766584 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766596 4772 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766608 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766621 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766634 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 
15:07:14.766660 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766673 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766686 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766699 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766713 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766724 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766736 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766748 4772 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766761 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766773 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766785 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766796 4772 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766808 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766820 4772 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766832 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766843 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766856 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766869 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766881 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766892 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766904 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766916 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" 
Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766927 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766939 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766952 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766963 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766975 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766987 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.766999 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767011 4772 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767022 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767033 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767045 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767057 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767070 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767082 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767095 4772 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767108 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767120 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767132 4772 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767144 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767156 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767220 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767234 4772 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767247 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767259 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767271 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767282 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767294 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767307 4772 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767318 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc 
kubenswrapper[4772]: I0127 15:07:14.767330 4772 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767342 4772 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767389 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767402 4772 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767413 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767424 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767436 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767447 4772 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767458 4772 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767470 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767481 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767493 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767504 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767546 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.767946 4772 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764249 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.764641 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765075 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765483 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765739 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.765946 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769016 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769080 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769047 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769340 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769376 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769400 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769847 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.769907 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.770110 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.770396 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.770415 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.770700 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.770763 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.771103 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.771436 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.771586 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.771730 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.772390 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.772466 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.772713 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.772904 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.773437 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.773700 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.773897 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774236 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774510 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774839 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774915 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.775069 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.775603 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.776153 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.776618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.776894 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.777374 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.777722 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.777966 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.778757 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.778967 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779029 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.779153 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.779221 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:15.279203022 +0000 UTC m=+21.259812210 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779295 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779448 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779709 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.779809 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:07:15.279795759 +0000 UTC m=+21.260404967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779917 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779927 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779934 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.779959 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780149 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780223 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780249 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780369 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780602 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780743 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.780868 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781081 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.781106 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781116 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.781194 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:15.281154317 +0000 UTC m=+21.261763515 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781350 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781502 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.781770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.782011 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.782371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.782432 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.782625 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.782810 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.785881 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.785899 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.785914 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.785966 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:15.285950913 +0000 UTC m=+21.266560081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.786209 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.774397 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.787431 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.787911 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.789238 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.789679 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.789734 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.789882 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.790423 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.790534 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67" exitCode=255 Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.790619 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.790661 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.790672 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.790730 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:15.290697797 +0000 UTC m=+21.271306895 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.790756 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67"} Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.790824 4772 scope.go:117] "RemoveContainer" containerID="21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.790884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.791748 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod 
"43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.791951 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.792000 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.792123 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.793461 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.794954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.796758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.798027 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.799802 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.800106 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.802526 4772 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.806275 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.812317 4772 scope.go:117] "RemoveContainer" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67" Jan 27 15:07:14 crc kubenswrapper[4772]: E0127 15:07:14.812733 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.812776 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.813049 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.813154 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.815422 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.816337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.817554 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.820078 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.821635 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.826648 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.832814 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.837181 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.846633 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.858437 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.867897 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.867944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868015 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868029 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868040 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868097 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868108 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868117 4772 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868128 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868137 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 27 
15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868146 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868156 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868181 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868193 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868204 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868215 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868225 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868235 4772 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868245 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868255 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868265 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868276 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868287 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868297 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868306 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868316 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868328 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868337 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868347 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868358 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868368 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868379 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" 
Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868389 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868399 4772 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868411 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868422 4772 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868433 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868444 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868466 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc 
kubenswrapper[4772]: I0127 15:07:14.868478 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868488 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868498 4772 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868508 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868519 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868530 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868540 4772 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868551 4772 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868563 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868573 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868594 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868607 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868618 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868627 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868638 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on 
node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868648 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868658 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868668 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868678 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868690 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868700 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868710 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868719 4772 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868728 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868737 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868747 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868757 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868766 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868776 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868785 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868794 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868804 4772 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868814 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868823 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868832 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868842 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868850 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868861 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868870 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868880 4772 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868890 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868901 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868912 4772 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868922 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 
15:07:14.868933 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868946 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868958 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868968 4772 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868978 4772 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.868989 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869002 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869014 4772 
reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869024 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869035 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869045 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869093 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.869444 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.870322 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.881074 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.894375 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"message\\\":\\\"W0127 15:06:57.758128 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 15:06:57.758449 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769526417 cert, and key in /tmp/serving-cert-633789686/serving-signer.crt, /tmp/serving-cert-633789686/serving-signer.key\\\\nI0127 15:06:58.043108 1 observer_polling.go:159] Starting file observer\\\\nW0127 15:06:58.045578 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:06:58.045739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:06:58.047315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-633789686/tls.crt::/tmp/serving-cert-633789686/tls.key\\\\\\\"\\\\nF0127 15:06:58.297465 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.906708 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.933670 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.943603 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.954611 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.961370 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:07:14 crc kubenswrapper[4772]: I0127 15:07:14.982019 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:07:14 crc kubenswrapper[4772]: W0127 15:07:14.989333 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0c9c8240549313c1fb724b3845fa904773be10bf9695748d577f7630d7ee8265 WatchSource:0}: Error finding container 0c9c8240549313c1fb724b3845fa904773be10bf9695748d577f7630d7ee8265: Status 404 returned error can't find the container with id 0c9c8240549313c1fb724b3845fa904773be10bf9695748d577f7630d7ee8265 Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.376879 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377071 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-01-27 15:07:16.377043392 +0000 UTC m=+22.357652490 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.377291 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.377317 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.377334 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.377565 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377604 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377636 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377669 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:16.3776574 +0000 UTC m=+22.358266498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377750 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:16.377725342 +0000 UTC m=+22.358334510 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377758 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377795 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377807 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377892 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:16.377877206 +0000 UTC m=+22.358486304 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377954 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377963 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377970 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.377992 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:16.377986039 +0000 UTC m=+22.358595127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.379826 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.493130 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 15:02:14 +0000 UTC, rotation deadline is 2026-11-07 11:43:22.531797809 +0000 UTC Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.493200 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6812h36m7.038599849s for next certificate rotation Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.625834 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:28:43.659872933 +0000 UTC Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.662157 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.662289 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.794145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.794205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5a238964e704642f78de8ef46fa665accf609f3d0d53e2d04174c20478840530"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.795213 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.796864 4772 scope.go:117] "RemoveContainer" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67" Jan 27 15:07:15 crc kubenswrapper[4772]: E0127 15:07:15.797000 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.797109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c9c8240549313c1fb724b3845fa904773be10bf9695748d577f7630d7ee8265"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.798372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.798398 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.798412 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6d2ab5bedc3e93a9e7981a953481fe73478caf4bfc9f40829609d914429ca3ec"} Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.813255 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21e07984f4ca6d9ea37d8213eab7f36c2a5342806beacdb07b50a15ffba13ded\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"message\\\":\\\"W0127 15:06:57.758128 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 15:06:57.758449 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769526417 cert, and key in /tmp/serving-cert-633789686/serving-signer.crt, 
/tmp/serving-cert-633789686/serving-signer.key\\\\nI0127 15:06:58.043108 1 observer_polling.go:159] Starting file observer\\\\nW0127 15:06:58.045578 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:06:58.045739 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:06:58.047315 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-633789686/tls.crt::/tmp/serving-cert-633789686/tls.key\\\\\\\"\\\\nF0127 15:06:58.297465 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.829087 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.842214 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.861646 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.876852 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.889224 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.903247 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.917849 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.934761 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.948299 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.960553 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.973959 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:15 crc kubenswrapper[4772]: I0127 15:07:15.996383 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.007873 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.035663 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.049688 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.087941 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.107371 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.314606 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dtdj6"] Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.315217 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-q46tm"] Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.315413 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.315555 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318097 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318308 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318325 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318434 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318470 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318573 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.318624 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.341302 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.365647 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.382613 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.386414 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.386524 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-serviceca\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.386558 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386645 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386667 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:07:18.386553619 +0000 UTC m=+24.367162727 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.386777 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386922 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:18.386901669 +0000 UTC m=+24.367510767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386965 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386985 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.386997 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387058 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:18.387048383 +0000 UTC m=+24.367657661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387121 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-host\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387210 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387247 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pswh\" (UniqueName: \"kubernetes.io/projected/95a893d4-4faa-40b2-b505-9698fe428ba8-kube-api-access-7pswh\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387306 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") 
" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387360 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387379 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95a893d4-4faa-40b2-b505-9698fe428ba8-hosts-file\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387394 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387428 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:18.387419893 +0000 UTC m=+24.368028981 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.387444 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qln7r\" (UniqueName: \"kubernetes.io/projected/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-kube-api-access-qln7r\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387483 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.387547 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:18.387540337 +0000 UTC m=+24.368149435 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.399669 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.416366 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.433461 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.447280 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.462345 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.473674 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pswh\" (UniqueName: \"kubernetes.io/projected/95a893d4-4faa-40b2-b505-9698fe428ba8-kube-api-access-7pswh\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488848 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/95a893d4-4faa-40b2-b505-9698fe428ba8-hosts-file\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488872 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qln7r\" (UniqueName: \"kubernetes.io/projected/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-kube-api-access-qln7r\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-serviceca\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488939 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-host\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.488976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/95a893d4-4faa-40b2-b505-9698fe428ba8-hosts-file\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.489012 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-host\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " 
pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.490990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-serviceca\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.504408 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c
4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.515440 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qln7r\" (UniqueName: \"kubernetes.io/projected/fed65bae-f1c4-4c97-bb6d-d4144fe2532b-kube-api-access-qln7r\") pod \"node-ca-q46tm\" (UID: \"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\") " pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.526661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pswh\" (UniqueName: \"kubernetes.io/projected/95a893d4-4faa-40b2-b505-9698fe428ba8-kube-api-access-7pswh\") pod \"node-resolver-dtdj6\" (UID: \"95a893d4-4faa-40b2-b505-9698fe428ba8\") " pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc 
kubenswrapper[4772]: I0127 15:07:16.533948 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.580595 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.597734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.618195 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.626876 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:41:00.562975131 +0000 UTC Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.631018 4772 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dtdj6" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.638130 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q46tm" Jan 27 15:07:16 crc kubenswrapper[4772]: W0127 15:07:16.645078 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a893d4_4faa_40b2_b505_9698fe428ba8.slice/crio-1c0961d39afdfce06462f956256b15a99563f79b26cbab0f99cd2451b4fad029 WatchSource:0}: Error finding container 1c0961d39afdfce06462f956256b15a99563f79b26cbab0f99cd2451b4fad029: Status 404 returned error can't find the container with id 1c0961d39afdfce06462f956256b15a99563f79b26cbab0f99cd2451b4fad029 Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.664396 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.664986 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.665118 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.665234 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.669785 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.682542 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.683435 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 15:07:16 crc 
kubenswrapper[4772]: I0127 15:07:16.685519 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.686256 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.691898 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: W0127 15:07:16.692314 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed65bae_f1c4_4c97_bb6d_d4144fe2532b.slice/crio-a3e708439b4fd068181bd4d2a3735a29f3f7f68a4cb602ed80fdec2f8dd0e6da WatchSource:0}: Error finding container a3e708439b4fd068181bd4d2a3735a29f3f7f68a4cb602ed80fdec2f8dd0e6da: Status 404 returned error can't find the container with id a3e708439b4fd068181bd4d2a3735a29f3f7f68a4cb602ed80fdec2f8dd0e6da Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.692904 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.693139 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.693719 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.695043 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.695883 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.697041 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.697747 4772 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.699465 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.700118 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.700885 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.702071 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.705435 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.706511 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.713611 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.714887 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.715893 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.717325 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.718004 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.718539 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.720480 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.721022 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.725817 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.726768 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.727875 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.728707 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.729762 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.730355 4772 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.730534 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.731772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.733096 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.733826 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.734606 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.736507 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.737732 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.738478 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.739904 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.740764 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.741889 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.742672 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.743932 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.745272 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.745831 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.746504 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.747659 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.749006 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.749730 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.750376 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.751651 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.752238 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.752589 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.754278 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.755108 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.766616 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.795075 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.801774 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q46tm" event={"ID":"fed65bae-f1c4-4c97-bb6d-d4144fe2532b","Type":"ContainerStarted","Data":"a3e708439b4fd068181bd4d2a3735a29f3f7f68a4cb602ed80fdec2f8dd0e6da"}
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.804073 4772 scope.go:117] "RemoveContainer" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.804195 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dtdj6" event={"ID":"95a893d4-4faa-40b2-b505-9698fe428ba8","Type":"ContainerStarted","Data":"1c0961d39afdfce06462f956256b15a99563f79b26cbab0f99cd2451b4fad029"}
Jan 27 15:07:16 crc kubenswrapper[4772]: E0127 15:07:16.804358 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Jan 27 15:07:16 crc kubenswrapper[4772]: I0127 15:07:16.810392 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741
fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:16Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.122756 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4hwxn"] Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.123049 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-c7pdz"] Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.123277 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.125143 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.125408 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.125941 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.126147 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.126501 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.127132 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-x7jwx"] Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.127343 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.127908 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.133148 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.133357 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.133692 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.133728 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.133866 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.134071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.136815 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.145727 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.162312 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.174707 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.186785 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.194816 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67794a44-d793-4fd7-9e54-e40437f67c0b-proxy-tls\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.194886 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67794a44-d793-4fd7-9e54-e40437f67c0b-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.194915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6ph\" (UniqueName: 
\"kubernetes.io/projected/67794a44-d793-4fd7-9e54-e40437f67c0b-kube-api-access-lh6ph\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.195009 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/67794a44-d793-4fd7-9e54-e40437f67c0b-rootfs\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.200664 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 
15:07:17.212121 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.224745 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.239011 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.252891 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.264457 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.273281 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.292321 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.295950 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-kubelet\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.295987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-hostroot\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296003 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-daemon-config\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296024 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/67794a44-d793-4fd7-9e54-e40437f67c0b-rootfs\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67794a44-d793-4fd7-9e54-e40437f67c0b-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296054 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cni-binary-copy\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-netns\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-bin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-multus-certs\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-k8s-cni-cncf-io\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296126 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296140 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cnibin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296156 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4sv\" (UniqueName: \"kubernetes.io/projected/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-kube-api-access-8d4sv\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296185 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-socket-dir-parent\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296201 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh6ph\" (UniqueName: \"kubernetes.io/projected/67794a44-d793-4fd7-9e54-e40437f67c0b-kube-api-access-lh6ph\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296220 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntsst\" (UniqueName: \"kubernetes.io/projected/1acef947-6310-4ac0-bc84-a06d91f84cb6-kube-api-access-ntsst\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/67794a44-d793-4fd7-9e54-e40437f67c0b-rootfs\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296260 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-conf-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.296931 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-etc-kubernetes\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297001 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-system-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297019 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297019 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/67794a44-d793-4fd7-9e54-e40437f67c0b-mcd-auth-proxy-config\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297057 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-cnibin\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297074 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-os-release\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297150 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-multus\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " 
pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297188 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67794a44-d793-4fd7-9e54-e40437f67c0b-proxy-tls\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.297259 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-os-release\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.302511 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/67794a44-d793-4fd7-9e54-e40437f67c0b-proxy-tls\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.320897 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.328340 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6ph\" (UniqueName: \"kubernetes.io/projected/67794a44-d793-4fd7-9e54-e40437f67c0b-kube-api-access-lh6ph\") pod \"machine-config-daemon-4hwxn\" (UID: \"67794a44-d793-4fd7-9e54-e40437f67c0b\") " pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.334625 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.347960 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.362646 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.374475 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.390018 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-bin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398221 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-multus-certs\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-k8s-cni-cncf-io\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398311 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398338 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cnibin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-multus-certs\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398371 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4sv\" (UniqueName: \"kubernetes.io/projected/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-kube-api-access-8d4sv\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398396 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntsst\" (UniqueName: 
\"kubernetes.io/projected/1acef947-6310-4ac0-bc84-a06d91f84cb6-kube-api-access-ntsst\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398416 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-socket-dir-parent\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398436 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398455 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-conf-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-etc-kubernetes\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-cnibin\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398542 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398557 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-system-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398574 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398588 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-os-release\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398610 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-multus\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398632 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-os-release\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-hostroot\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398693 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-daemon-config\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-kubelet\") pod \"multus-x7jwx\" (UID: 
\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398735 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cni-binary-copy\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398752 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-netns\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398811 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-netns\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398862 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cnibin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398856 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-bin\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398936 4772 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.398977 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-run-k8s-cni-cncf-io\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-system-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399134 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-hostroot\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399204 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-os-release\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399240 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-kubelet\") pod \"multus-x7jwx\" 
(UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399486 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-socket-dir-parent\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399604 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399650 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-os-release\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399677 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-host-var-lib-cni-multus\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399727 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-cni-binary-copy\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 
15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399747 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-daemon-config\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399766 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-conf-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-etc-kubernetes\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399792 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-cnibin\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.399789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-multus-cni-dir\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.400126 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1acef947-6310-4ac0-bc84-a06d91f84cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.400214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1acef947-6310-4ac0-bc84-a06d91f84cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: \"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.400207 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.412312 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.414792 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4sv\" (UniqueName: \"kubernetes.io/projected/87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8-kube-api-access-8d4sv\") pod \"multus-x7jwx\" (UID: \"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\") " pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.415862 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntsst\" (UniqueName: \"kubernetes.io/projected/1acef947-6310-4ac0-bc84-a06d91f84cb6-kube-api-access-ntsst\") pod \"multus-additional-cni-plugins-c7pdz\" (UID: 
\"1acef947-6310-4ac0-bc84-a06d91f84cb6\") " pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.424377 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab122
6587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.437245 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.438346 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.449114 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" Jan 27 15:07:17 crc kubenswrapper[4772]: W0127 15:07:17.450519 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67794a44_d793_4fd7_9e54_e40437f67c0b.slice/crio-fd0ddca1b8be2e12111ee0e50ac1b53dde3dd038c4cc631af4b78ec5dddd73cf WatchSource:0}: Error finding container fd0ddca1b8be2e12111ee0e50ac1b53dde3dd038c4cc631af4b78ec5dddd73cf: Status 404 returned error can't find the container with id fd0ddca1b8be2e12111ee0e50ac1b53dde3dd038c4cc631af4b78ec5dddd73cf Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.452316 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.460151 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x7jwx" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.475002 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: W0127 15:07:17.475374 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87cb2a5b_099e_4a3b_a0bc_cba76a1a00a8.slice/crio-7cd942ff56dee37fd1e7c9671a18a0a4c9e67dd558fe1ef941783ed45ca45ea6 WatchSource:0}: Error finding container 7cd942ff56dee37fd1e7c9671a18a0a4c9e67dd558fe1ef941783ed45ca45ea6: Status 404 returned error can't find the container with id 7cd942ff56dee37fd1e7c9671a18a0a4c9e67dd558fe1ef941783ed45ca45ea6 Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.490653 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.505301 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.519791 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n2khk"] Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.521742 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.523765 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.524279 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.524471 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.524890 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.525010 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.525131 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.526294 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.538175 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.588300 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601515 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601649 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601698 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2dt6g\" (UniqueName: \"kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601772 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601829 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601852 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601899 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.601965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.602023 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.602058 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.602083 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.602133 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.621361 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.628072 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 01:35:13.127747721 +0000 UTC Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.637813 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.657955 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.662832 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:17 crc kubenswrapper[4772]: E0127 15:07:17.662969 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.673367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.692970 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702620 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702638 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702693 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702724 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert\") 
pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702753 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dt6g\" (UniqueName: \"kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702773 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702802 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units\") pod \"ovnkube-node-n2khk\" (UID: 
\"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702901 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.702930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703559 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703608 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703696 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.703716 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.704454 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.704904 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.704956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.704980 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705003 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705067 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705092 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705114 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705130 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.705138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib\") pod \"ovnkube-node-n2khk\" (UID: 
\"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.707834 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.721436 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.724431 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dt6g\" (UniqueName: \"kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g\") pod \"ovnkube-node-n2khk\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.736206 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.754425 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.765659 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.781083 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.794714 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.807547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.809837 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerStarted","Data":"ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.809799 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.809883 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerStarted","Data":"7cd942ff56dee37fd1e7c9671a18a0a4c9e67dd558fe1ef941783ed45ca45ea6"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.811258 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q46tm" event={"ID":"fed65bae-f1c4-4c97-bb6d-d4144fe2532b","Type":"ContainerStarted","Data":"6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.813335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dtdj6" event={"ID":"95a893d4-4faa-40b2-b505-9698fe428ba8","Type":"ContainerStarted","Data":"6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.814861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerStarted","Data":"af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.814892 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerStarted","Data":"85e2f41d41be6c7cd6e458ef932bea5281182263cb6d8eb0efde7f5c3f5f3224"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.816735 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.816791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.816805 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"fd0ddca1b8be2e12111ee0e50ac1b53dde3dd038c4cc631af4b78ec5dddd73cf"} Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.832316 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.845040 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.858944 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.872870 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.887784 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.898102 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.916643 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.921949 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/
etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: W0127 15:07:17.932352 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod736264c8_cd18_479a_88ba_e1ec15dbfdae.slice/crio-b6a209d8fc4e180971a6f92a0f3c7493472a2095b6c5303a9b0ce0f4e62056a9 WatchSource:0}: Error finding container b6a209d8fc4e180971a6f92a0f3c7493472a2095b6c5303a9b0ce0f4e62056a9: Status 404 returned error can't find the container with id 
b6a209d8fc4e180971a6f92a0f3c7493472a2095b6c5303a9b0ce0f4e62056a9 Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.942578 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:17 crc kubenswrapper[4772]: I0127 15:07:17.965436 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:17Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.003706 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.047293 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.089870 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.126804 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.164869 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.209873 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.245258 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.412123 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412298 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:07:22.41227002 +0000 UTC m=+28.392879118 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.412384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.412419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.412453 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.412470 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412578 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412590 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412600 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412601 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412635 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:22.41262848 +0000 UTC m=+28.393237568 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412661 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412684 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412697 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412710 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412664 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:22.41264429 +0000 UTC m=+28.393253438 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412892 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:22.412853486 +0000 UTC m=+28.393462634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.412913 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:22.412902948 +0000 UTC m=+28.393512146 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.629063 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 00:22:47.813012344 +0000 UTC Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.662932 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.663014 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.663290 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:18 crc kubenswrapper[4772]: E0127 15:07:18.663417 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.821904 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538" exitCode=0 Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.821971 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538"} Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.824758 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" exitCode=0 Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.825017 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.825198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"b6a209d8fc4e180971a6f92a0f3c7493472a2095b6c5303a9b0ce0f4e62056a9"} Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.842154 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.858039 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.873886 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.887348 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.900795 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.916412 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.929824 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.947824 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.961307 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.975297 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:18 crc kubenswrapper[4772]: I0127 15:07:18.991139 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:18Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.008449 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.027531 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.041042 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.057453 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.069902 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.081185 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.094802 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.108839 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.134432 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.149841 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.162215 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.176978 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.204076 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.244113 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.285556 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.328093 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.396086 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.418047 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.451389 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.629309 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 08:01:38.785341056 +0000 UTC Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.662865 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:19 crc kubenswrapper[4772]: E0127 15:07:19.662981 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.835793 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.836074 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.836186 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.836286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.836375 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.836461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" 
event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.838027 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1" exitCode=0 Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.838141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1"} Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.856315 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.877711 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.892824 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.907140 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.931299 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.944387 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.956605 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.974483 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.987991 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:19 crc kubenswrapper[4772]: I0127 15:07:19.999218 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.009506 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.024675 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.040327 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.052455 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.070360 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.629641 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 00:29:12.132567393 +0000 UTC Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.662204 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.662237 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.662359 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.662573 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.843116 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc" exitCode=0 Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.843194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.846571 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.848429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.848470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.848484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.848605 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 
15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.855562 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.855597 4772 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.855967 4772 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.857003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.857031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.857041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.857056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.857065 4772 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.872913 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.878283 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.881964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.881996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.882007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.882037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.882051 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.890189 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.899732 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redh
at/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99
d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815
\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\"
:448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.902506 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.904507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.904553 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.904565 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.904583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.904596 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.916797 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.918641 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.924698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.924941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.925040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.925125 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.925220 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.932796 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.938117 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.942134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.942185 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.942197 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.942213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.942223 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.943356 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.955615 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: E0127 15:07:20.955729 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.957435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.957482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.957493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.957510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.957523 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:20Z","lastTransitionTime":"2026-01-27T15:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.965531 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.978887 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:20 crc kubenswrapper[4772]: I0127 15:07:20.990830 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:20Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.004047 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.016151 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"19
2.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.035041 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.049131 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.059226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.059329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.059386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.059443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.059498 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.069894 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.162207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.162247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.162258 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.162275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.162287 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.264091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.264370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.264454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.264521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.264585 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.367047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.367328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.367410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.367492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.367553 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.469820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.469854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.469863 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.469878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.469886 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.572605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.572664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.572677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.572707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.572719 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.630357 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 10:40:59.157086387 +0000 UTC Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.662905 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:21 crc kubenswrapper[4772]: E0127 15:07:21.663047 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.674848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.675010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.675037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.675064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.675080 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.777019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.777055 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.777068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.777083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.777093 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.847762 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd" exitCode=0 Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.847812 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.851777 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.879844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.879883 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.879891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.879905 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.879915 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.882695 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.895907 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.906850 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.921129 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.935460 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.948961 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.964577 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.980674 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.982047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.982083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.982292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.982325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.982362 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:21Z","lastTransitionTime":"2026-01-27T15:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:21 crc kubenswrapper[4772]: I0127 15:07:21.994631 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:21Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.010773 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.029888 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.046263 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.060580 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.075724 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.084670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.084708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.084718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.084731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.084741 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.087859 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.186923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.186958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.186966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.186979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.186988 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.289668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.289716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.289724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.289740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.289750 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.392146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.392215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.392252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.392269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.392278 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.449737 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.449851 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.449882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.449916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.449937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450051 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450097 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450105 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450053 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:07:30.450025756 +0000 UTC m=+36.430634894 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450110 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450190 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450127 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450247 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:30.450219712 +0000 UTC m=+36.430828830 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450281 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:30.450263363 +0000 UTC m=+36.430872561 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450255 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450060 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450368 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:07:30.450356156 +0000 UTC m=+36.430965334 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.450394 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:30.450382146 +0000 UTC m=+36.430991324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.494024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.494056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.494067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.494081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: 
I0127 15:07:22.494102 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.596993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.597056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.597074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.597099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.597121 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.630561 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 07:09:42.105757713 +0000 UTC Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.662367 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.662514 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.662595 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:22 crc kubenswrapper[4772]: E0127 15:07:22.662747 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.700035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.700062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.700071 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.700084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.700092 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.802777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.802820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.802829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.802844 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.802853 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.859694 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57" exitCode=0 Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.859743 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.873351 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.885760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.904084 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.915718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.915757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.915771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.915788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.915800 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:22Z","lastTransitionTime":"2026-01-27T15:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.917531 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.935699 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:22 crc kubenswrapper[4772]: I0127 15:07:22.954094 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:22.964789 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:22.984861 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:22.997149 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.009514 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.018689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.018724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.018733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.018748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.018757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.023586 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.037469 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c0
66217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.056874 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"con
tainerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.070128 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.080114 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.121945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.121983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.121996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.122013 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.122026 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.224477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.224529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.224538 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.224551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.224561 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.327462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.327509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.327522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.327542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.327557 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.431495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.431540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.431555 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.431577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.431594 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.534156 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.534385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.534468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.534551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.534627 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.631016 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 18:36:14.169139746 +0000 UTC Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.636880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.636921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.636933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.636949 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.636960 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.662888 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:23 crc kubenswrapper[4772]: E0127 15:07:23.663022 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.739318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.739357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.739368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.739384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.739396 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.841802 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.841868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.841880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.841897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.841908 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.870821 4772 generic.go:334] "Generic (PLEG): container finished" podID="1acef947-6310-4ac0-bc84-a06d91f84cb6" containerID="d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8" exitCode=0 Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.870866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerDied","Data":"d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.882958 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.905945 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.920256 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee8
5761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.933958 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.945047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:23 crc 
kubenswrapper[4772]: I0127 15:07:23.945082 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.945090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.945104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.945114 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:23Z","lastTransitionTime":"2026-01-27T15:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.955556 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.973434 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.983713 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:23 crc kubenswrapper[4772]: I0127 15:07:23.996305 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.008225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.021604 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.033025 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.048080 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.051366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.051389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.051398 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.051411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.051419 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.061912 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.074936 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.091587 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.153353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.153383 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.153393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.153409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.153421 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.256010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.256051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.256061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.256080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.256091 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.358955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.358999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.359010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.359027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.359038 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.456234 4772 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.461991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.462033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.462045 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.462061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.462074 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.564680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.564722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.564734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.564750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.564758 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.632540 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:34:51.348688904 +0000 UTC Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.662905 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.663179 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:24 crc kubenswrapper[4772]: E0127 15:07:24.663365 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:24 crc kubenswrapper[4772]: E0127 15:07:24.663793 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.671044 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.671100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.671112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.671126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.671137 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.684910 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.701828 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.718474 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.732287 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.752209 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.764387 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.774006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.774051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.774061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.774076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.774086 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.775066 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.791734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee8
5761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.811574 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.826563 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.838673 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.853250 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.870528 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.879602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.879648 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.879656 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.879673 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.879682 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.882066 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.884249 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" event={"ID":"1acef947-6310-4ac0-bc84-a06d91f84cb6","Type":"ContainerStarted","Data":"790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.889443 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.889695 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.893948 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.907737 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.913232 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.920557 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.930365 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.938358 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.950602 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.961095 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.971999 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.983508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.983552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.983564 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.983582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.983597 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:24Z","lastTransitionTime":"2026-01-27T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:24 crc kubenswrapper[4772]: I0127 15:07:24.989534 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.000068 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.011031 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.024511 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.036653 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.056013 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff5
7615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.069142 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.078391 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.085723 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.085775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.085785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.085800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.085813 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.096479 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.108714 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.116504 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.127153 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.139005 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.149723 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.160810 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.175323 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.188762 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.188800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.188811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.188827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.188838 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.189321 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.199815 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.217734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.231767 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.245125 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.261521 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.274726 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.291629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.291693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.291710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.291733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.291752 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.394217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.394270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.394282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.394302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.394315 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.496290 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.496333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.496344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.496360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.496368 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.598832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.598882 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.598898 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.598947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.598961 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.633423 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:39:04.687424573 +0000 UTC Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.662810 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:25 crc kubenswrapper[4772]: E0127 15:07:25.662998 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.701759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.701798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.701809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.701825 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.701835 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.804672 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.804996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.805098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.805228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.805343 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.892506 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.893030 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.909397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.909459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.909471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.909489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.909499 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:25Z","lastTransitionTime":"2026-01-27T15:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.917964 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.931449 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.946727 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.959471 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:25 crc kubenswrapper[4772]: I0127 15:07:25.972313 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.005965 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.011600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.011642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.011654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.011672 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.011686 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.033157 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.052293 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.072484 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.085350 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.106030 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.114467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.114514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.114523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.114538 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.114548 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.121494 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.133968 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.153295 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\
\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5
646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.164736 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.173489 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:26Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.216912 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.216962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.216975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.216994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.217007 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.320112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.320154 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.320198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.320232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.320266 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.422311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.422360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.422372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.422389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.422401 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.524675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.524745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.524757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.524770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.524796 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.627773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.627812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.627823 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.627838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.627848 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.634183 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 20:05:07.996061563 +0000 UTC Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.662595 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.662657 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:26 crc kubenswrapper[4772]: E0127 15:07:26.662757 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:26 crc kubenswrapper[4772]: E0127 15:07:26.663061 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.730117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.730156 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.730181 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.730198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.730209 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.832638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.832679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.832693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.832709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.832720 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.895468 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.985773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.985822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.985831 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.985845 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:26 crc kubenswrapper[4772]: I0127 15:07:26.985855 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:26Z","lastTransitionTime":"2026-01-27T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.088596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.088627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.088638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.088652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.088661 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.191200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.191248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.191257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.191270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.191280 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.295591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.296687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.296770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.296839 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.296850 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.399388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.399426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.399434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.399448 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.399456 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.502360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.502424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.502442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.502465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.502482 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.604378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.604403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.604413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.604428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.604437 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.634475 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 22:03:47.636543572 +0000 UTC Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.661978 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:27 crc kubenswrapper[4772]: E0127 15:07:27.662198 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.707706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.707765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.707780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.707801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.707814 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.810763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.810810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.810821 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.810837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.810849 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.900496 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/0.log" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.903109 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec" exitCode=1 Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.903182 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.903721 4772 scope.go:117] "RemoveContainer" containerID="a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.912964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.913011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.913022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.913038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.913051 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:27Z","lastTransitionTime":"2026-01-27T15:07:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.919943 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.930504 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e
1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.943459 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.955865 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.967487 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:27 crc kubenswrapper[4772]: I0127 15:07:27.979781 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.001575 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:27Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0127 15:07:27.137663 6049 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:07:27.137944 6049 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:07:27.138014 6049 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0127 15:07:27.138393 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 15:07:27.138457 6049 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 15:07:27.138485 6049 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 15:07:27.138538 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 15:07:27.138563 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 15:07:27.138584 6049 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 15:07:27.138489 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 15:07:27.138607 6049 factory.go:656] Stopping watch factory\\\\nI0127 15:07:27.138647 6049 ovnkube.go:599] Stopped ovnkube\\\\nI0127 
15:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e29
82d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:27Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.015751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.015790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.015802 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.015817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.015851 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.018367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.040424 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.058212 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.075048 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.087782 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.097669 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.119231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.119269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.119281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.119296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.119307 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.122248 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.133072 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.221820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.221902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.221913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.221933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.221946 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.376128 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.376176 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.376186 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.376201 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.376212 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.482120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.482595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.482611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.482631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.482646 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.585246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.585285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.585296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.585310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.585320 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.634955 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:13:17.435613329 +0000 UTC Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.662361 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:28 crc kubenswrapper[4772]: E0127 15:07:28.662475 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.662790 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:28 crc kubenswrapper[4772]: E0127 15:07:28.662863 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.687562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.687597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.687607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.687620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.687630 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.790011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.790092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.790109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.790126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.790160 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.892321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.892363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.892374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.892388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.892398 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.908083 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/0.log" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.910646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97"} Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.910749 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.927811 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.940311 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.951445 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.971760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:27Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0127 15:07:27.137663 6049 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:07:27.137944 6049 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:07:27.138014 6049 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0127 15:07:27.138393 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 15:07:27.138457 6049 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 15:07:27.138485 6049 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 15:07:27.138538 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 15:07:27.138563 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 15:07:27.138584 6049 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 15:07:27.138489 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 15:07:27.138607 6049 factory.go:656] Stopping watch factory\\\\nI0127 15:07:27.138647 6049 ovnkube.go:599] Stopped ovnkube\\\\nI0127 
15:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.991507 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:28Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.995244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.995317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.995331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.995397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:28 crc kubenswrapper[4772]: I0127 15:07:28.995474 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:28Z","lastTransitionTime":"2026-01-27T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.008549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.022616 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.037076 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.055815 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff5
7615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.073040 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx"] Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.073705 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.075588 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.075685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.075789 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.081467 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.081560 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghx7x\" (UniqueName: \"kubernetes.io/projected/3cc8fde5-4905-4fb1-b683-27ea4921b462-kube-api-access-ghx7x\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.081599 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.081641 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.086023 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6
445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.097717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.097748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.097757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.097773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc 
kubenswrapper[4772]: I0127 15:07:29.097784 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.099256 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.109567 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.118525 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.127855 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.140134 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.153226 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.162873 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.182878 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghx7x\" (UniqueName: \"kubernetes.io/projected/3cc8fde5-4905-4fb1-b683-27ea4921b462-kube-api-access-ghx7x\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 
crc kubenswrapper[4772]: I0127 15:07:29.182933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.182971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.183043 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.183698 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.184457 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3cc8fde5-4905-4fb1-b683-27ea4921b462-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: 
\"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.186923 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:27Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0127 15:07:27.137663 6049 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:07:27.137944 6049 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:07:27.138014 6049 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0127 15:07:27.138393 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 15:07:27.138457 6049 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 15:07:27.138485 6049 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 15:07:27.138538 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 15:07:27.138563 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 15:07:27.138584 6049 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 15:07:27.138489 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 15:07:27.138607 6049 factory.go:656] Stopping watch factory\\\\nI0127 15:07:27.138647 6049 ovnkube.go:599] Stopped ovnkube\\\\nI0127 
15:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.190140 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3cc8fde5-4905-4fb1-b683-27ea4921b462-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.200353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.200542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc 
kubenswrapper[4772]: I0127 15:07:29.200599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.200656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.200724 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.202284 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghx7x\" (UniqueName: \"kubernetes.io/projected/3cc8fde5-4905-4fb1-b683-27ea4921b462-kube-api-access-ghx7x\") pod \"ovnkube-control-plane-749d76644c-wkvpx\" (UID: \"3cc8fde5-4905-4fb1-b683-27ea4921b462\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.202937 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.215349 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.229555 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.241577 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.258450 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff5
7615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.269294 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.279861 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.291347 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.302219 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.304283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.304409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.304515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.304617 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.304707 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.312752 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.322477 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.333277 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.385518 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.406953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.407007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.407023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.407045 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.407063 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.509395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.509422 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.509431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.509445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.509455 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.612033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.612066 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.612076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.612091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.612100 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.635955 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 00:18:35.166323642 +0000 UTC Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.662395 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:29 crc kubenswrapper[4772]: E0127 15:07:29.662516 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.714960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.714999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.715007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.715021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.715031 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.817443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.817487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.817498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.817514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.817526 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.915811 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/1.log" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.916577 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/0.log" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.918961 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.918997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919036 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:29Z","lastTransitionTime":"2026-01-27T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919520 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97" exitCode=1 Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.919608 4772 scope.go:117] "RemoveContainer" containerID="a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.920190 4772 scope.go:117] "RemoveContainer" containerID="44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97" Jan 27 15:07:29 crc kubenswrapper[4772]: E0127 15:07:29.920344 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.923487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" event={"ID":"3cc8fde5-4905-4fb1-b683-27ea4921b462","Type":"ContainerStarted","Data":"7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.923540 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" 
event={"ID":"3cc8fde5-4905-4fb1-b683-27ea4921b462","Type":"ContainerStarted","Data":"1d0c05a9e18a9cfab5b84d6eca0e5dd9ecaea00074297e641ad4ddb1a294ea8e"} Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.932423 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.946353 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.956449 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.966134 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.977857 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:29 crc kubenswrapper[4772]: I0127 15:07:29.998380 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.014750 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.021310 4772 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.021338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.021347 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.021360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.021370 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.032770 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.054705 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:27Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0127 15:07:27.137663 6049 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:07:27.137944 6049 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:07:27.138014 6049 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0127 15:07:27.138393 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 15:07:27.138457 6049 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 15:07:27.138485 6049 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 15:07:27.138538 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 15:07:27.138563 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 15:07:27.138584 6049 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 15:07:27.138489 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 15:07:27.138607 6049 factory.go:656] Stopping watch factory\\\\nI0127 15:07:27.138647 6049 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.067880 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.080331 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.095101 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.107763 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.123384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.123418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.123429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.123443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.123454 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.126572 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.140787 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.154818 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.225760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.225801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.225810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.225825 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.225834 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.328463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.328504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.328515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.328529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.328537 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.431002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.431345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.431420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.431529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.431625 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.493941 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.494051 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.494077 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.494107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.494127 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494246 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494294 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.494280857 +0000 UTC m=+52.474889955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494350 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.494301887 +0000 UTC m=+52.474910995 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494594 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494617 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494626 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494651 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494663 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.494653517 +0000 UTC m=+52.475262825 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.494736 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.494723739 +0000 UTC m=+52.475333017 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.495048 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.495152 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.495273 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:30 crc 
kubenswrapper[4772]: E0127 15:07:30.495518 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.495492361 +0000 UTC m=+52.476101469 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.534380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.534443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.534455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.534478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.534493 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.546469 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ql2vx"] Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.547007 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.547739 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.562397 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.576095 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.587796 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.594927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8l57\" (UniqueName: \"kubernetes.io/projected/371016c8-5a23-427d-aa0a-0faa241d86a7-kube-api-access-v8l57\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.594985 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.598772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.619096 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.634623 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636082 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 11:01:09.233295837 +0000 UTC Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.636671 4772 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.646248 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.660245 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.662355 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.662365 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.662547 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.662470 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.673568 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.684263 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.694997 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.695609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8l57\" (UniqueName: \"kubernetes.io/projected/371016c8-5a23-427d-aa0a-0faa241d86a7-kube-api-access-v8l57\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.695662 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.695802 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.695876 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:31.195856915 +0000 UTC m=+37.176466013 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.704732 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.714315 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc 
kubenswrapper[4772]: I0127 15:07:30.714857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8l57\" (UniqueName: \"kubernetes.io/projected/371016c8-5a23-427d-aa0a-0faa241d86a7-kube-api-access-v8l57\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.726875 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.738766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.738808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.738820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.738837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.738853 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.740600 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.749498 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.764324 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a1941175c2adff112ad9fafa1e24e1fec6c564305bb9ca8a437e2f0e9124dfec\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:27Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0127 15:07:27.137663 6049 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:07:27.137944 6049 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:07:27.138014 6049 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0127 15:07:27.138393 6049 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0127 15:07:27.138457 6049 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0127 15:07:27.138485 6049 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0127 15:07:27.138538 6049 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0127 15:07:27.138563 6049 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 15:07:27.138584 6049 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0127 15:07:27.138489 6049 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 15:07:27.138607 6049 factory.go:656] Stopping watch factory\\\\nI0127 15:07:27.138647 6049 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:07:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.844516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.844609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.844645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.844689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.844714 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.928863 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/1.log" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.932533 4772 scope.go:117] "RemoveContainer" containerID="44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97" Jan 27 15:07:30 crc kubenswrapper[4772]: E0127 15:07:30.932688 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.935158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" event={"ID":"3cc8fde5-4905-4fb1-b683-27ea4921b462","Type":"ContainerStarted","Data":"545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.945068 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.947337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.947372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.947384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.947400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.947411 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:30Z","lastTransitionTime":"2026-01-27T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.956972 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:30 crc kubenswrapper[4772]: I0127 15:07:30.983962 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:30Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.008064 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3
609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.022241 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.036680 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.049428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.049461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.049469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.049482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.049491 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.051850 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z 
is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.065426 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d
26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.075114 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.093638 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.107243 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.119245 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.129667 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.140182 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.151036 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc 
kubenswrapper[4772]: I0127 15:07:31.151740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.151766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.151778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.151796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.151806 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.163190 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.175740 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.194627 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.200242 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.200371 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.200432 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:07:32.200414192 +0000 UTC m=+38.181023290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.207985 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.219628 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.231330 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.245687 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.253915 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc 
kubenswrapper[4772]: I0127 15:07:31.253948 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.253958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.253973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.253985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.261034 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.278940 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.295378 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.316451 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff5
7615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.321450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.321493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.321503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.321518 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.321526 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.335007 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.335230 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909
e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.339192 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.339249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.339262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.339279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.339292 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.346540 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.351510 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.355198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.355241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.355256 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.355275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.355288 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.360469 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.366720 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.370685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.370739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.370751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.370770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.370783 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.371245 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc 
kubenswrapper[4772]: E0127 15:07:31.382727 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.384076 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.386757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.386784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.386793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.386807 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.386816 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.398952 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.400650 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.400824 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.402580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.402621 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.402630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.402647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.402659 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.412685 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.423494 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.504716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.504759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.504769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.504785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.504796 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.606733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.606773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.606782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.606797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.606806 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.636208 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 06:54:38.358478323 +0000 UTC Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.662657 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:31 crc kubenswrapper[4772]: E0127 15:07:31.662829 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.663717 4772 scope.go:117] "RemoveContainer" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.709407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.709443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.709454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.709469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.709480 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.812410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.812855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.812874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.812897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.812915 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.915337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.915385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.915396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.915412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.915424 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:31Z","lastTransitionTime":"2026-01-27T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.939446 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.941047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08"} Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.954789 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.969307 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:31 crc kubenswrapper[4772]: I0127 15:07:31.983707 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.000353 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:31Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.012757 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.025314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.025348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.025360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.025375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.025386 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.034415 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.057332 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.069769 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.085574 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.098710 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.108161 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.119977 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc 
kubenswrapper[4772]: I0127 15:07:32.127557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.127595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.127603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.127616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.127626 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.132582 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.144452 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.155298 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.173665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.189838 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.208677 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:32 crc kubenswrapper[4772]: E0127 15:07:32.208848 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:32 crc kubenswrapper[4772]: E0127 15:07:32.208913 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:34.208895959 +0000 UTC m=+40.189505067 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.229572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.229608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.229618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.229634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.229646 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.331792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.331831 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.331842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.331858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.331869 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.434468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.434525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.434546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.434573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.434594 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.536530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.536568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.536578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.536593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.536602 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.636487 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:30:48.071867813 +0000 UTC Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.639403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.639429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.639438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.639451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.639461 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.665290 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.665369 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:32 crc kubenswrapper[4772]: E0127 15:07:32.665464 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.665532 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:32 crc kubenswrapper[4772]: E0127 15:07:32.665679 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:32 crc kubenswrapper[4772]: E0127 15:07:32.666276 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.742494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.742534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.742543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.742558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.742569 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.845883 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.845959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.846226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.846259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.846283 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.948756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.948801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.948811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.948828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:32 crc kubenswrapper[4772]: I0127 15:07:32.948841 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:32Z","lastTransitionTime":"2026-01-27T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.051092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.051132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.051147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.051192 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.051209 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.153612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.153676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.153699 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.153719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.153733 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.256438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.256478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.256487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.256500 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.256509 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.359536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.359596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.359625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.359673 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.359696 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.461919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.461962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.461970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.461985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.461996 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.564253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.564295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.564307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.564326 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.564338 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.637443 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 19:59:45.302434912 +0000 UTC Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.662096 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:33 crc kubenswrapper[4772]: E0127 15:07:33.662257 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.667041 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.667106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.667117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.667129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.667211 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.769625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.769664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.769677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.769694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.769705 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.872428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.872461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.872471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.872484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.872493 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.975240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.975284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.975293 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.975308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:33 crc kubenswrapper[4772]: I0127 15:07:33.975317 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:33Z","lastTransitionTime":"2026-01-27T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.079897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.079950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.079968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.079992 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.080010 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.181695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.181731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.181740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.181752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.181762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.230212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:34 crc kubenswrapper[4772]: E0127 15:07:34.230373 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:34 crc kubenswrapper[4772]: E0127 15:07:34.230428 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:38.230413921 +0000 UTC m=+44.211023019 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.284511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.284545 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.284556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.284570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.284581 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.387197 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.387239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.387250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.387267 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.387279 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.489405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.489453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.489465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.489484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.489495 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.591975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.592009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.592020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.592035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.592045 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.637628 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 11:57:51.124484803 +0000 UTC Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.662916 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:34 crc kubenswrapper[4772]: E0127 15:07:34.663033 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.663325 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:34 crc kubenswrapper[4772]: E0127 15:07:34.663405 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.663454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:34 crc kubenswrapper[4772]: E0127 15:07:34.663531 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.677286 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.689377 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 
15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.693576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.693613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.693624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.693637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.693645 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.699573 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.717253 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.731828 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.746575 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.759676 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.771016 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.788469 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff5
7615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.796773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.796808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.796819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.796835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.796846 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.802287 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.813927 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.829584 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.849545 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.867797 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.889868 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.900769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.900804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.900814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.900829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.900840 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:34Z","lastTransitionTime":"2026-01-27T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.905400 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:34 crc kubenswrapper[4772]: I0127 15:07:34.917606 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:35 crc 
kubenswrapper[4772]: I0127 15:07:35.003698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.003736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.003748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.003766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.003779 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.138317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.138780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.138871 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.138945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.139106 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.241882 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.241933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.241946 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.241964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.241977 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.344742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.344789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.344798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.344941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.344964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.447392 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.447450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.447466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.447484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.447498 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.549808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.549842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.549858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.549875 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.549885 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.638452 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 21:15:00.125298241 +0000 UTC Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.651470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.651503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.651514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.651531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.651543 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.662759 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:35 crc kubenswrapper[4772]: E0127 15:07:35.662902 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.754368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.754400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.754409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.754422 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.754430 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.856307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.856355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.856369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.856385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.856394 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.958480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.958539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.958547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.958562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:35 crc kubenswrapper[4772]: I0127 15:07:35.958573 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:35Z","lastTransitionTime":"2026-01-27T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.060595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.060631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.060645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.060661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.060674 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.162667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.162701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.162711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.162726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.162736 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.265975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.266058 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.266081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.266106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.266123 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.369413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.369455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.369465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.369482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.369492 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.472897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.472969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.472990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.473016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.473037 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.575531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.576323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.576384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.576516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.576583 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.639325 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 02:23:52.257149762 +0000 UTC Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.662257 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.662273 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.662478 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:36 crc kubenswrapper[4772]: E0127 15:07:36.662560 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:36 crc kubenswrapper[4772]: E0127 15:07:36.662653 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:36 crc kubenswrapper[4772]: E0127 15:07:36.662754 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.680210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.680269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.680280 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.680297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.680309 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.782747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.782784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.782797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.782811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.782822 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.885569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.885612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.885630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.885648 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.885659 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.988006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.988047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.988062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.988081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:36 crc kubenswrapper[4772]: I0127 15:07:36.988094 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:36Z","lastTransitionTime":"2026-01-27T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.091620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.091689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.091714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.091745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.091769 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.195087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.195149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.195371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.195393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.195409 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.297854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.297895 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.297904 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.297917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.297927 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.401221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.401286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.401310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.401335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.401350 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.504999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.505042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.505054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.505081 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.505093 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.607508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.607564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.607577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.607597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.607611 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.639840 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:47:17.14797465 +0000 UTC Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.662522 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:37 crc kubenswrapper[4772]: E0127 15:07:37.662655 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.709695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.709751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.709761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.709778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.709791 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.813010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.813063 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.813073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.813086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.813095 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.916210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.916259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.916271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.916287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:37 crc kubenswrapper[4772]: I0127 15:07:37.916298 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:37Z","lastTransitionTime":"2026-01-27T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.018944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.019302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.019313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.019328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.019340 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.121927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.121993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.122012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.122040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.122058 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.223971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.224035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.224052 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.224076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.224095 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.263378 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.282118 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:38 crc kubenswrapper[4772]: E0127 15:07:38.282399 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:38 crc kubenswrapper[4772]: E0127 15:07:38.282482 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:07:46.28246009 +0000 UTC m=+52.263069228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.327151 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.327305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.327323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.327351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.327369 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.430607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.430653 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.430663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.430678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.430689 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.534029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.534079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.534090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.534130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.534144 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.637039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.637096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.637114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.637138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.637156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.640594 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 21:21:05.69207103 +0000 UTC Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.661916 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.661921 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:38 crc kubenswrapper[4772]: E0127 15:07:38.662086 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.662110 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:38 crc kubenswrapper[4772]: E0127 15:07:38.662195 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:38 crc kubenswrapper[4772]: E0127 15:07:38.662283 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.740316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.740393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.740413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.740755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.741033 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.844451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.844570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.844591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.844618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.844635 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.948297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.948381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.948393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.948411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:38 crc kubenswrapper[4772]: I0127 15:07:38.948425 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:38Z","lastTransitionTime":"2026-01-27T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.050881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.050943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.050960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.050976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.050988 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.152860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.152914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.152930 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.152962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.152980 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.255872 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.255927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.255939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.255962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.256009 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.362757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.362848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.362889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.362923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.362947 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.466303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.466351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.466367 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.466387 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.466402 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.569249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.569296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.569310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.569332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.569347 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.641354 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 10:31:19.011289632 +0000 UTC Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.662853 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:39 crc kubenswrapper[4772]: E0127 15:07:39.663065 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.671988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.672022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.672031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.672043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.672052 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.773882 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.773930 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.773947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.773964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.773976 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.876267 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.876302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.876311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.876325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.876334 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.978532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.978629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.978646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.978668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:39 crc kubenswrapper[4772]: I0127 15:07:39.978684 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:39Z","lastTransitionTime":"2026-01-27T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.081490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.081536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.081549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.081568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.081582 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.183894 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.183928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.183939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.183952 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.183961 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.286505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.286759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.286837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.286899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.286955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.389654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.389689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.389697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.389710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.389718 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.492203 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.492443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.492512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.492580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.492636 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.594896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.594927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.594935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.594947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.594956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.641470 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 13:43:14.017470977 +0000 UTC Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.662801 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.662831 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:40 crc kubenswrapper[4772]: E0127 15:07:40.662928 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.663076 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:40 crc kubenswrapper[4772]: E0127 15:07:40.663125 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:40 crc kubenswrapper[4772]: E0127 15:07:40.663456 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.696909 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.696944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.696954 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.696969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.696979 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.799070 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.799340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.799513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.799608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.799689 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.902060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.902358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.902510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.902672 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:40 crc kubenswrapper[4772]: I0127 15:07:40.902793 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:40Z","lastTransitionTime":"2026-01-27T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.006528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.006569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.006581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.006597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.006607 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.109445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.109495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.109507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.109525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.109536 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.211717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.211760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.211769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.211783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.211792 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.314392 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.314496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.314513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.314530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.314542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.417460 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.417498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.417509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.417527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.417540 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.423074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.423101 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.423114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.423128 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.423138 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.434406 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.438064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.438099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.438112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.438130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.438140 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.450119 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.453085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.453120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.453128 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.453141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.453149 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.463659 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.467085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.467120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.467131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.467145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.467183 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.476845 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.479437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.479468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.479481 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.479499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.479511 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.490079 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.490372 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.519557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.519595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.519606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.519622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.519633 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.622401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.622644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.622770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.622867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.622955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.641760 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 11:56:29.40995986 +0000 UTC Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.662224 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:41 crc kubenswrapper[4772]: E0127 15:07:41.662399 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.663120 4772 scope.go:117] "RemoveContainer" containerID="44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.726695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.726746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.726759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.726777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.726790 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.829128 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.829211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.829229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.829255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.829275 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.931729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.931794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.931812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.931834 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.931853 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:41Z","lastTransitionTime":"2026-01-27T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.974877 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/1.log" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.978093 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7"} Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.978281 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:07:41 crc kubenswrapper[4772]: I0127 15:07:41.991227 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276
703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.003023 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.042262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.042316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.042327 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.042342 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.042353 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.046133 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.061988 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc 
kubenswrapper[4772]: I0127 15:07:42.080900 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.093687 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.106434 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.124404 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.142244 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.144111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.144229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.144241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.144257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.144268 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.154874 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.166760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.183453 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.197588 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.210852 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.220008 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.240478 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.246256 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.246302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.246312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.246330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 
15:07:42.246340 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.258120 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.348452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.348491 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.348502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.348519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.348530 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.450921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.450977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.450989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.451003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.451013 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.552813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.552847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.552858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.552872 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.552885 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.641882 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:01:46.178645859 +0000 UTC Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.654830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.654858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.654866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.654880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.654890 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.662410 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.662495 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:42 crc kubenswrapper[4772]: E0127 15:07:42.662532 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.662410 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:42 crc kubenswrapper[4772]: E0127 15:07:42.662635 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:42 crc kubenswrapper[4772]: E0127 15:07:42.662719 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.756858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.756915 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.756937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.756965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.756986 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.859370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.859425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.859436 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.859454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.859467 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.961789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.961831 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.961842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.961869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.961884 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:42Z","lastTransitionTime":"2026-01-27T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.981665 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/2.log" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.982328 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/1.log" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.984632 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" exitCode=1 Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.984675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7"} Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.984723 4772 scope.go:117] "RemoveContainer" containerID="44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97" Jan 27 15:07:42 crc kubenswrapper[4772]: I0127 15:07:42.985377 4772 scope.go:117] "RemoveContainer" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" Jan 27 15:07:42 crc kubenswrapper[4772]: E0127 15:07:42.985512 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.004767 4772 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.023381 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.036753 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.049680 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.059574 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.063861 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.063907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.063920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.063939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.063954 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.078603 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.090213 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.100785 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.132572 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.164031 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.165692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.165730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.165740 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.165755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.165765 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.174757 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 
27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.187850 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.199274 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.210275 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.222420 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.247991 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982
d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.267915 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.267975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.267984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.267999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.268027 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.269979 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.370288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.370330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.370341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.370358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.370370 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.474019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.474114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.474139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.474233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.474263 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.576576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.576629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.576646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.576668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.576684 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.642440 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 18:11:04.781888721 +0000 UTC Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.662453 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:43 crc kubenswrapper[4772]: E0127 15:07:43.662608 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.678577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.678637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.678654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.678678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.678696 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.782087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.782141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.782157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.782200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.782219 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.884579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.884655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.884667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.884684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.884697 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.987239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.987283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.987294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.987311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.987322 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:43Z","lastTransitionTime":"2026-01-27T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:43 crc kubenswrapper[4772]: I0127 15:07:43.989453 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/2.log" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.090497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.090587 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.090613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.090640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.090662 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.193178 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.193228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.193263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.193283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.193293 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.295518 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.295569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.295585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.295606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.295618 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.398236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.398275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.398283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.398297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.398306 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.500352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.500386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.500394 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.500408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.500425 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.603556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.603605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.603622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.603639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.603650 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.643314 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:56:31.689912996 +0000 UTC Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.662671 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:44 crc kubenswrapper[4772]: E0127 15:07:44.662799 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.663309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.663368 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:44 crc kubenswrapper[4772]: E0127 15:07:44.663385 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:44 crc kubenswrapper[4772]: E0127 15:07:44.663510 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.674988 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.684906 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.693090 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.703052 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.705971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.705997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.706005 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.706018 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.706026 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.713325 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 
27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.724902 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.735549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.748096 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.764140 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44564ab0212c32423d179406999749f443cae1bd72ea8d12ff4411e23de77d97\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:28Z\\\",\\\"message\\\":\\\"twork-check-target template LB for network=default: []services.LB{}\\\\nI0127 15:07:28.751281 6196 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982
d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.775732 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.787732 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.801571 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.807746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.807782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.807793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.807807 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.807818 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.815858 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z 
is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.827215 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d
26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a5
78bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.838893 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.848772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.866145 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019be
e1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cdd
f104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.909671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.909715 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.909730 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.909745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:44 crc kubenswrapper[4772]: I0127 15:07:44.909757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:44Z","lastTransitionTime":"2026-01-27T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.012447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.012496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.012505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.012521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.012532 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.115212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.115265 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.115281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.115302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.115320 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.218465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.218514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.218529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.218550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.218563 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.321231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.321278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.321288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.321311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.321333 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.424685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.424759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.424775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.424800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.424820 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.527107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.527139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.527148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.527160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.527216 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.629334 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.629384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.629395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.629413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.629430 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.644462 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 09:20:26.808390846 +0000 UTC Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.661951 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:45 crc kubenswrapper[4772]: E0127 15:07:45.662087 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.732216 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.732260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.732271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.732289 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.732302 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.835694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.835765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.835776 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.835794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.835805 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.937979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.938047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.938063 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.938088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:45 crc kubenswrapper[4772]: I0127 15:07:45.938105 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:45Z","lastTransitionTime":"2026-01-27T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.040333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.040367 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.040378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.040395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.040405 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.142691 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.142729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.142742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.142759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.142772 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.245768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.245805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.245813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.245827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.245855 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.290410 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.290606 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.290692 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:02.290667502 +0000 UTC m=+68.271276660 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.348558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.348609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.348634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.348657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.348673 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.456440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.456484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.456495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.456510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.456520 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.561105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.561153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.561182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.561200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.561214 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.592985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.593126 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593157 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:08:18.593135207 +0000 UTC m=+84.573744305 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.593214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593258 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593274 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593286 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593303 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered 
Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593321 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:18.593311902 +0000 UTC m=+84.573921000 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593337 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:18.593328042 +0000 UTC m=+84.573937150 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.593260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.593367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593444 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593466 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593489 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593498 4772 projected.go:194] Error 
preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593530 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:18.593509547 +0000 UTC m=+84.574118645 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.593548 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:18.593542078 +0000 UTC m=+84.574151176 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.645258 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 07:27:43.545897704 +0000 UTC Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.662607 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.662692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.662710 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.662772 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.662876 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.662995 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.664203 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.664234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.664245 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.664261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.664272 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.767064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.767103 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.767115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.767133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.767145 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.831857 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.833102 4772 scope.go:117] "RemoveContainer" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" Jan 27 15:07:46 crc kubenswrapper[4772]: E0127 15:07:46.833482 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.856491 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.870706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.870763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.870781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.870808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.870831 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.873282 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.892590 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.916810 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.933517 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.949013 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.967445 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.973042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.973093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.973109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.973136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.973154 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:46Z","lastTransitionTime":"2026-01-27T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:46 crc kubenswrapper[4772]: I0127 15:07:46.983272 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:46Z 
is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.007867 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01
-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.023910 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.037230 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.049930 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.064015 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075555 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075638 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.075638 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.088748 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.101341 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.112367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc 
kubenswrapper[4772]: I0127 15:07:47.178301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.178402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.178429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.178468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.178498 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.281979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.282043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.283806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.283833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.283844 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.386359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.386412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.386425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.386442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.386454 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.488434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.488472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.488483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.488499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.488511 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.591649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.591720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.591745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.591777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.591800 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.645660 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 14:22:27.576648238 +0000 UTC Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.662016 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:47 crc kubenswrapper[4772]: E0127 15:07:47.662144 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.694092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.694193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.694219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.694244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.694262 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.741324 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.752264 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.754250 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.775158 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.790349 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.796592 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.796625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.796635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.796650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.796660 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.802050 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.816783 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.830022 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.843774 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.854956 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.873774 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.886530 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.896116 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.898892 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.898926 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.898939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.898954 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.898966 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:47Z","lastTransitionTime":"2026-01-27T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.906413 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.920404 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc077
8ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.936661 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc 
kubenswrapper[4772]: I0127 15:07:47.950081 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.962611 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:47 crc kubenswrapper[4772]: I0127 15:07:47.974198 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:07:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.001463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.001498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.001508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.001523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.001533 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.103526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.103567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.103576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.103591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.103600 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.206039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.206098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.206117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.206141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.206225 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.267284 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.280249 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.290820 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.307048 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.308260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.308294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.308308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.308324 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.308336 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.321342 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d
1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.333367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.345081 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.358700 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.371007 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.383222 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.393832 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.410444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.410491 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.410501 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.410519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.410532 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.415997 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.430155 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.444367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.457008 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.468734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.481052 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc 
kubenswrapper[4772]: I0127 15:07:48.492653 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.502708 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.512636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.512693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.512708 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.512728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.512740 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.614955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.615002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.615010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.615025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.615034 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.646230 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:19:41.122006038 +0000 UTC Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.663076 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.663140 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.663097 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:48 crc kubenswrapper[4772]: E0127 15:07:48.663275 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:48 crc kubenswrapper[4772]: E0127 15:07:48.663370 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:48 crc kubenswrapper[4772]: E0127 15:07:48.663457 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.718483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.718524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.718533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.718549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.718560 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.821327 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.821375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.821391 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.821476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.821497 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.924014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.924066 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.924075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.924093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:48 crc kubenswrapper[4772]: I0127 15:07:48.924107 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:48Z","lastTransitionTime":"2026-01-27T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.027212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.027255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.027266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.027284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.027295 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.130292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.130365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.130386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.130414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.130434 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.234335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.234671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.234691 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.234713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.234728 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.337407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.337442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.337450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.337464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.337472 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.439653 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.439707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.439716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.439732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.439745 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.543010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.543074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.543097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.543127 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.543148 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646237 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.646358 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 10:56:58.117468312 +0000 UTC Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.662757 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:49 crc kubenswrapper[4772]: E0127 15:07:49.662903 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.748829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.748878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.748903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.748925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.748940 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.851651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.851698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.851711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.851726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.851738 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.953698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.953729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.953738 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.953752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:49 crc kubenswrapper[4772]: I0127 15:07:49.953760 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:49Z","lastTransitionTime":"2026-01-27T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.056285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.056364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.056380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.056399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.056411 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.159066 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.159110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.159122 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.159140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.159154 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.262327 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.262357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.262365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.262378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.262387 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.365380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.365434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.365451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.365483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.365507 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.468127 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.468164 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.468218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.468234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.468246 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.575568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.575633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.575645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.575677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.575698 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.647458 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 18:15:57.156439563 +0000 UTC Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.663535 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.663622 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.663734 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:50 crc kubenswrapper[4772]: E0127 15:07:50.663762 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:50 crc kubenswrapper[4772]: E0127 15:07:50.663933 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:50 crc kubenswrapper[4772]: E0127 15:07:50.663991 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.678826 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.678886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.678897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.678919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.678931 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.782090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.782148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.782160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.782227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.782241 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.884634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.884682 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.884694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.884713 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.884726 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.987229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.987274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.987285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.987300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:50 crc kubenswrapper[4772]: I0127 15:07:50.987309 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:50Z","lastTransitionTime":"2026-01-27T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.090021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.090070 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.090092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.090110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.090123 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.193683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.193731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.193742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.193758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.193769 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.296319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.296393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.296442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.296467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.296482 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.399110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.399183 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.399194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.399210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.399220 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.502851 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.502901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.502916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.502939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.502957 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.605142 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.605224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.605236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.605257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.605271 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.610057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.610104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.610128 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.610147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.610160 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.625143 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:51Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.630620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.630658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.630669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.630686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.630697 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.644408 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:51Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.647846 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 21:49:09.111859564 +0000 UTC Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.648561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.648597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.648606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.648622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.648630 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.659323 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:51Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.661905 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.662005 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.663766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.663792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.663800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.663811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.663821 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.674509 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c
7e3cb9de\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:51Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.677704 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.677753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.677765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.677782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.677795 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.690000 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:51Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:51 crc kubenswrapper[4772]: E0127 15:07:51.690111 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.708302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.708333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.708344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.708361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.708371 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.810295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.810346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.810356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.810378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.810388 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.912767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.912833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.912850 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.912875 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:51 crc kubenswrapper[4772]: I0127 15:07:51.912892 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:51Z","lastTransitionTime":"2026-01-27T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.014320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.014356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.014368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.014381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.014390 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.116533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.116601 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.116620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.116650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.116668 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.219671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.219728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.219746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.219769 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.219786 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.323241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.323304 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.323329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.323358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.323384 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.426234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.426295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.426313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.426340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.426359 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.529988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.530029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.530042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.530061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.530074 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.632984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.633034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.633050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.633071 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.633086 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.648382 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 13:18:54.752362309 +0000 UTC Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.662805 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:52 crc kubenswrapper[4772]: E0127 15:07:52.663339 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.662817 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.663829 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:52 crc kubenswrapper[4772]: E0127 15:07:52.664108 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:52 crc kubenswrapper[4772]: E0127 15:07:52.667759 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.735994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.736047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.736058 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.736076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.736089 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.838604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.838639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.838647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.838662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.838671 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.941980 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.942021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.942033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.942047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:52 crc kubenswrapper[4772]: I0127 15:07:52.942057 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:52Z","lastTransitionTime":"2026-01-27T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.044457 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.044509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.044540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.044564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.044581 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.147431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.147464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.147476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.147491 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.147502 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.250408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.250461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.250472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.250489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.250499 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.352637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.352700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.352708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.352721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.352731 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.456245 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.456312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.456332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.456356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.456375 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.559341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.559388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.559399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.559419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.559434 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.648831 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 04:00:08.54394988 +0000 UTC Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.661977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.662038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.662051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.662062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.662072 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.662083 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:53 crc kubenswrapper[4772]: E0127 15:07:53.662277 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.764789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.764831 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.764842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.764861 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.764874 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.867796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.867840 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.867856 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.867871 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.867881 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.971693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.971766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.971790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.971822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:53 crc kubenswrapper[4772]: I0127 15:07:53.971861 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:53Z","lastTransitionTime":"2026-01-27T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.074366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.074398 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.074406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.074420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.074428 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.176963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.177008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.177020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.177036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.177050 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.279841 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.279894 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.279909 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.279928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.279941 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.382137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.382233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.382249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.382272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.382288 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.484865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.484924 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.484932 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.484947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.484956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.586863 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.586935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.586959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.586988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.587009 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.649021 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 20:29:46.546595802 +0000 UTC Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.661986 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.662005 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:54 crc kubenswrapper[4772]: E0127 15:07:54.662107 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.662176 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:54 crc kubenswrapper[4772]: E0127 15:07:54.662261 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:54 crc kubenswrapper[4772]: E0127 15:07:54.662370 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.676310 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.686650 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.689270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.689290 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.689300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.689314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.689323 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.697503 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.707093 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc 
kubenswrapper[4772]: I0127 15:07:54.719140 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.731376 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.744540 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.756079 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.771260 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.783220 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.791257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.791321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.791336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.791353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.792210 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.796707 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc1867
45bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.806628 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.817836 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.831451 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.850031 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.860839 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.881367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.894137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.894193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.894203 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.894220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 
15:07:54.894232 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.894398 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:07:54Z is after 2025-08-24T17:21:41Z" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.997035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.997073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.997083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.997100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:54 crc kubenswrapper[4772]: I0127 15:07:54.997112 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:54Z","lastTransitionTime":"2026-01-27T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.099061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.099102 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.099117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.099140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.099153 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.201592 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.201625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.201635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.201650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.201662 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.305067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.305123 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.305139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.305222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.305246 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.407805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.407858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.407874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.407895 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.407908 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.510456 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.510516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.510538 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.510568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.510591 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.614269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.614710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.614879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.615038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.615222 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.649498 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 02:51:52.967908726 +0000 UTC Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.662590 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:55 crc kubenswrapper[4772]: E0127 15:07:55.662840 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.717896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.717933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.717944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.717959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.717972 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.820365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.820434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.820458 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.820487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.820511 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.923599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.923643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.923653 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.923669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:55 crc kubenswrapper[4772]: I0127 15:07:55.923682 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:55Z","lastTransitionTime":"2026-01-27T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.026499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.026867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.026982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.027056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.027128 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.130729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.130766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.130777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.130793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.130803 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.233703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.233995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.234068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.234146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.234243 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.337452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.337493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.337501 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.337521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.337531 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.439945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.439984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.440001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.440023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.440033 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.543431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.543492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.543504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.543528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.543543 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.646465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.646925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.646936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.646953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.646964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.649656 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:41:36.127466939 +0000 UTC Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.661948 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.662043 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.661970 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:56 crc kubenswrapper[4772]: E0127 15:07:56.662150 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:56 crc kubenswrapper[4772]: E0127 15:07:56.662272 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:56 crc kubenswrapper[4772]: E0127 15:07:56.662372 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.749809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.749887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.749906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.749935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.749957 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.853086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.853145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.853187 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.853214 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.853237 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.956316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.956354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.956365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.956382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:56 crc kubenswrapper[4772]: I0127 15:07:56.956393 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:56Z","lastTransitionTime":"2026-01-27T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.099486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.099572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.099594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.099611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.099625 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.202484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.202509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.202517 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.202530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.202539 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.305094 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.305158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.305193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.305207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.305217 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.407033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.407065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.407074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.407087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.407097 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.509958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.510050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.510062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.510077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.510088 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.613455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.613526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.613540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.613567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.613583 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.649897 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:40:31.847490531 +0000 UTC Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.662287 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:57 crc kubenswrapper[4772]: E0127 15:07:57.662406 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.716137 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.716208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.716220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.716236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.716247 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.819747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.819817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.819842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.819871 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.819891 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.922941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.922969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.922979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.922993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:57 crc kubenswrapper[4772]: I0127 15:07:57.923003 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:57Z","lastTransitionTime":"2026-01-27T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.025345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.025374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.025385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.025407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.025419 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.127251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.127293 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.127309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.127329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.127346 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.228913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.228953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.228964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.228983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.228994 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.330935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.331253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.331353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.331452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.331538 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.434619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.434654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.434664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.434678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.434688 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.537702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.537797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.537817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.537841 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.537859 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.640727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.640794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.640813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.640832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.640846 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.650970 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:33:36.513932384 +0000 UTC Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.662405 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.662447 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.662480 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:07:58 crc kubenswrapper[4772]: E0127 15:07:58.662539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:07:58 crc kubenswrapper[4772]: E0127 15:07:58.662602 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:07:58 crc kubenswrapper[4772]: E0127 15:07:58.662712 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.743397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.743808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.743969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.744130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.744323 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.847105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.847379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.847447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.847529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.847604 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.950121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.950189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.950206 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.950227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:58 crc kubenswrapper[4772]: I0127 15:07:58.950244 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:58Z","lastTransitionTime":"2026-01-27T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.051967 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.051991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.051999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.052010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.052019 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.155341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.155377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.155388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.155403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.155413 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.257437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.257469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.257478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.257491 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.257501 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.359750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.359787 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.359797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.359811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.359822 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.462460 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.462502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.462513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.462529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.462542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.565134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.565227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.565244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.565274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.565289 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.651932 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:12:28.388235496 +0000 UTC Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.662346 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:07:59 crc kubenswrapper[4772]: E0127 15:07:59.662468 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.667232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.667266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.667275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.667289 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.667299 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.769235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.769296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.769308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.769325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.769337 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.871667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.871716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.871730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.871748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.871762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.973798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.973836 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.973862 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.973879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:07:59 crc kubenswrapper[4772]: I0127 15:07:59.973889 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:07:59Z","lastTransitionTime":"2026-01-27T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.076699 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.076749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.076757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.076771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.076779 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.179779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.179828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.179838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.179854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.179863 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.282573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.282612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.282622 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.282640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.282650 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.384934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.384977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.384988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.385004 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.385015 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.488132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.488192 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.488201 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.488217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.488229 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.590615 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.590654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.590668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.590687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.590701 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.652806 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 00:34:46.230937063 +0000 UTC Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.664782 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:00 crc kubenswrapper[4772]: E0127 15:08:00.664898 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.665094 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:00 crc kubenswrapper[4772]: E0127 15:08:00.665157 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.665929 4772 scope.go:117] "RemoveContainer" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" Jan 27 15:08:00 crc kubenswrapper[4772]: E0127 15:08:00.666074 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.666244 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:00 crc kubenswrapper[4772]: E0127 15:08:00.666316 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.693728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.693767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.693778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.693794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.693804 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.799104 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.799141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.799149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.799183 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.799194 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.901922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.901950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.901959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.901974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:00 crc kubenswrapper[4772]: I0127 15:08:00.901985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:00Z","lastTransitionTime":"2026-01-27T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.004730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.004772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.004781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.004801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.004813 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.107687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.107736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.107748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.107768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.107781 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.209792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.209854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.209868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.209885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.209896 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.312043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.312087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.312099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.312116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.312126 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.414696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.414731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.414743 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.414758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.414770 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.517606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.517648 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.517661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.517678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.517688 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.620692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.620721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.620731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.620745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.620755 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.653142 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 01:57:17.807469065 +0000 UTC Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.662787 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:01 crc kubenswrapper[4772]: E0127 15:08:01.662886 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.722729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.722764 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.722774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.722792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.722802 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.825527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.825559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.825570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.825586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.825599 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.929504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.929546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.929557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.929574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.929584 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.942377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.942429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.942444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.942462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.942474 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: E0127 15:08:01.954952 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:01Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.958931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.958971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.958982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.958998 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.959009 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: E0127 15:08:01.969228 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:01Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.973404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.973446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.973459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.973483 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.973495 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: E0127 15:08:01.984507 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:01Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.988371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.988415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.988425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.988444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:01 crc kubenswrapper[4772]: I0127 15:08:01.988461 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:01Z","lastTransitionTime":"2026-01-27T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:01 crc kubenswrapper[4772]: E0127 15:08:01.999449 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:01Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.002285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.002310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.002320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.002333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.002341 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.013640 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.013743 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.032284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.032314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.032322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.032334 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.032343 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.135068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.135102 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.135110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.135123 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.135132 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.237273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.237311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.237323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.237339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.237351 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.339261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.339302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.339315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.339332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.339343 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.361678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.361799 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.361840 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:08:34.361825703 +0000 UTC m=+100.342434801 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.441947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.441994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.442011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.442030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.442039 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.544687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.544729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.544744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.544765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.544777 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.646341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.646397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.646407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.646423 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.646434 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.654186 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 00:21:49.964793684 +0000 UTC Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.662544 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.662574 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.662687 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.662734 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.662858 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:02 crc kubenswrapper[4772]: E0127 15:08:02.662963 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.748727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.748772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.748782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.748796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.748807 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.851025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.851062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.851070 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.851093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.851103 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.958376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.958421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.958454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.958474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:02 crc kubenswrapper[4772]: I0127 15:08:02.958484 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:02Z","lastTransitionTime":"2026-01-27T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.069214 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.069272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.069283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.069301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.069312 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.171366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.171405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.171414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.171431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.171440 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.273501 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.273537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.273546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.273560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.273569 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.375488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.375566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.375578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.375596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.375606 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.477615 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.477677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.477686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.477736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.477748 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.580392 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.580439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.580453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.580471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.580484 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.654466 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 06:51:56.369202357 +0000 UTC Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.662801 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:03 crc kubenswrapper[4772]: E0127 15:08:03.662902 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.682583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.682608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.682616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.682627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.682636 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.785403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.785499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.785530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.785563 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.785590 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.888000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.888047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.888060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.888078 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.888091 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.990855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.990890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.990899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.990913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:03 crc kubenswrapper[4772]: I0127 15:08:03.990923 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:03Z","lastTransitionTime":"2026-01-27T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.093439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.093471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.093478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.093493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.093502 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.195918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.195960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.195972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.195988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.195999 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.298292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.298330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.298341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.298356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.298368 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.400757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.400814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.400829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.400846 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.400857 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.533219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.533259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.533270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.533288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.533299 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.635853 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.635893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.635902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.635916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.635926 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.655092 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 23:00:43.81977253 +0000 UTC Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.662438 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.662512 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.662438 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:04 crc kubenswrapper[4772]: E0127 15:08:04.662615 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:04 crc kubenswrapper[4772]: E0127 15:08:04.662769 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:04 crc kubenswrapper[4772]: E0127 15:08:04.662875 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.683505 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.696558 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.704751 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.713209 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.722898 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.732812 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.741806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.741834 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.741844 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.741858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.741869 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.744881 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.754232 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.762597 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.773975 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035
eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.783180 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.793290 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.802242 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.824957 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.835188 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.843614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.843667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.843680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.843694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.843704 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.846633 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.873009 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.891748 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:04Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.946494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.946541 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.946564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.946584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:04 crc kubenswrapper[4772]: I0127 15:08:04.946598 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:04Z","lastTransitionTime":"2026-01-27T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.048787 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.048821 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.048830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.048842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.048850 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.051864 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/0.log" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.051905 4772 generic.go:334] "Generic (PLEG): container finished" podID="87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8" containerID="ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33" exitCode=1 Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.051932 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerDied","Data":"ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.052318 4772 scope.go:117] "RemoveContainer" containerID="ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.066858 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.079181 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.095926 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.109369 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.119964 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.131737 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.145087 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.150356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.150393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.150406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.150444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.150456 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.158601 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.171741 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.180361 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.196105 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.206633 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.215549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.224356 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.233382 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.243713 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc 
kubenswrapper[4772]: I0127 15:08:05.252302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.252335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.252344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.252358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.252367 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.264277 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.274628 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:05Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.354309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.354355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.354367 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.354382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.354392 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.456507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.456560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.456576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.456596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.456613 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.558607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.558644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.558655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.558670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.558679 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.655764 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 18:52:15.672424013 +0000 UTC Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.660778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.660845 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.660870 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.660907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.660957 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.661983 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:05 crc kubenswrapper[4772]: E0127 15:08:05.662106 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.763765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.763812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.763824 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.763842 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.763853 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.865706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.865755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.865766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.865781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.865791 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.968933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.968972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.968986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.969003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:05 crc kubenswrapper[4772]: I0127 15:08:05.969015 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:05Z","lastTransitionTime":"2026-01-27T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.056988 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/0.log" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.057038 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerStarted","Data":"9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.070188 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.071203 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.071258 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.071275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.071297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.071314 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.087291 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.100636 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.111898 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.124082 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.138550 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.151107 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.161817 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.171231 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.174069 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.174096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.174107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc 
kubenswrapper[4772]: I0127 15:08:06.174121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.174132 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.189914 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.219861 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.238813 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.255693 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284d
da3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.268668 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.276496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.276560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.276575 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.276594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.276618 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.281364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 
27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.295959 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.307569 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.319493 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:08:06Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.386215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.386253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.386263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.386279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.386291 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.488372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.488401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.488409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.488439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.488448 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.590942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.590981 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.590993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.591009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.591020 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.656281 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 00:39:23.673343576 +0000 UTC Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.662664 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.662666 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:06 crc kubenswrapper[4772]: E0127 15:08:06.662775 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:06 crc kubenswrapper[4772]: E0127 15:08:06.662833 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.662669 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:06 crc kubenswrapper[4772]: E0127 15:08:06.662916 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.693522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.693576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.693589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.693607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.693628 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.796502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.796559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.796567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.796585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.796595 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.899200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.899244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.899253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.899269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:06 crc kubenswrapper[4772]: I0127 15:08:06.899279 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:06Z","lastTransitionTime":"2026-01-27T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.002430 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.002477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.002487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.002506 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.002522 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.104499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.104544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.104557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.104575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.104587 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.207990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.208059 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.208072 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.208109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.208122 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.310642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.310686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.310696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.310710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.310718 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.413273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.413332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.413343 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.413358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.413367 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.516109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.516158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.516194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.516213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.516225 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.619091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.619130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.619139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.619155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.619180 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.657435 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 14:12:23.172922165 +0000 UTC Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.662730 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:07 crc kubenswrapper[4772]: E0127 15:08:07.662839 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.721211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.721243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.721251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.721263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.721272 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.823657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.823711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.823720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.823736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.823746 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.925889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.925925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.925934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.925950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:07 crc kubenswrapper[4772]: I0127 15:08:07.925959 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:07Z","lastTransitionTime":"2026-01-27T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.027942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.027994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.028004 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.028018 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.028028 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.130377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.130415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.130429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.130446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.130458 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.231969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.231999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.232006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.232019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.232027 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.334152 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.334226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.334241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.334260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.334275 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.436847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.436895 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.436906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.436959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.436970 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.539084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.539129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.539160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.539194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.539205 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.641924 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.641966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.641980 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.641998 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.642009 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.658489 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:57:10.470768523 +0000 UTC Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.662803 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.662909 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:08 crc kubenswrapper[4772]: E0127 15:08:08.663036 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.663094 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:08 crc kubenswrapper[4772]: E0127 15:08:08.663105 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:08 crc kubenswrapper[4772]: E0127 15:08:08.663331 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.744885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.744928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.744938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.744955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.744965 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.847697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.847745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.847757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.847774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.847786 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.949893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.949934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.949947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.949965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:08 crc kubenswrapper[4772]: I0127 15:08:08.949976 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:08Z","lastTransitionTime":"2026-01-27T15:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.052271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.052316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.052326 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.052340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.052349 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.154571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.154603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.154613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.154627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.154637 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.257239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.257281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.257292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.257307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.257316 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.359471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.359536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.359551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.359568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.359579 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.462354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.462394 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.462403 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.462417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.462427 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.565612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.565678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.565696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.565726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.565748 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.658750 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:32:29.990523342 +0000 UTC Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.661857 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:09 crc kubenswrapper[4772]: E0127 15:08:09.661978 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.668441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.668475 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.668486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.668500 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.668508 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.772022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.772080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.772091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.772107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.772118 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.874928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.875247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.875259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.875277 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.875287 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.977559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.977593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.977601 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.977618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:09 crc kubenswrapper[4772]: I0127 15:08:09.977626 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:09Z","lastTransitionTime":"2026-01-27T15:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.080127 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.080153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.080187 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.080201 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.080209 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.183250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.183311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.183341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.183371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.183389 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.290409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.290453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.290464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.290480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.290491 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.393364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.393411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.393423 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.393439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.393453 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.495945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.496002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.496015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.496033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.496045 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.598445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.598497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.598512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.598531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.598542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.659359 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 05:33:31.392985993 +0000 UTC Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.662667 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.662787 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:10 crc kubenswrapper[4772]: E0127 15:08:10.662795 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:10 crc kubenswrapper[4772]: E0127 15:08:10.662868 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.662667 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:10 crc kubenswrapper[4772]: E0127 15:08:10.662939 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.700696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.700733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.700741 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.700755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.700764 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.802587 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.802619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.802628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.802641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.802650 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.904953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.905003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.905014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.905029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:10 crc kubenswrapper[4772]: I0127 15:08:10.905039 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:10Z","lastTransitionTime":"2026-01-27T15:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.008502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.008561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.008579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.008602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.008619 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.111546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.111619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.111642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.111670 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.111691 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.214550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.214596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.214619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.214639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.214652 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.318121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.318216 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.318238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.318261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.318273 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.421468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.421511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.421522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.421540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.421552 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.524193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.524237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.524247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.524262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.524272 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.627603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.627982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.628133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.628195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.628215 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.659899 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 03:32:48.024226201 +0000 UTC Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.662218 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:11 crc kubenswrapper[4772]: E0127 15:08:11.662330 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.730404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.730444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.730452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.730464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.730473 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.833607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.833650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.833661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.833675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.833685 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.936596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.936664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.936676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.936695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:11 crc kubenswrapper[4772]: I0127 15:08:11.936708 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:11Z","lastTransitionTime":"2026-01-27T15:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.039630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.039671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.039685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.039705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.039720 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.062205 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.062252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.062266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.062282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.062291 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.073118 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.077854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.077938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.077962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.078017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.078039 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.097286 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.099957 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.099983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.099999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.100020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.100031 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.110096 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.113903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.113931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.113956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.113969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.113977 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.125601 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.128452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.128486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.128495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.128507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.128516 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.139943 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.140109 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.141983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.142237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.142263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.142286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.142308 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.247407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.247467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.247484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.247505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.247899 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.350757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.350784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.350795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.350807 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.350815 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.454108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.454136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.454145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.454185 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.454199 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.556526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.556564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.556573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.556587 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.556596 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.659978 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.660035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.660047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.660007 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 05:14:26.187134993 +0000 UTC Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.660064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.660080 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.662454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.662493 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.662509 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.662676 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.662794 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:12 crc kubenswrapper[4772]: E0127 15:08:12.662855 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.663404 4772 scope.go:117] "RemoveContainer" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.763513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.763564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.763576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.763599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.763613 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.866285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.866323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.866354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.866370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.866381 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.968553 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.968592 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.968602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.968616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:12 crc kubenswrapper[4772]: I0127 15:08:12.968625 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:12Z","lastTransitionTime":"2026-01-27T15:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.072999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.073029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.073038 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.073051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.073060 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.079149 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/2.log" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.082907 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.083364 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.099739 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6
e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.117351 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.128086 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.145739 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.164511 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.174829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.174854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.174863 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.174876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.174885 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.183820 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.193470 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.202225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.211649 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.222916 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc 
kubenswrapper[4772]: I0127 15:08:13.234760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.245036 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.258608 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.268620 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.276668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 
15:08:13.276740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.276752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.276768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.276800 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.288524 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\
\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.301267 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.312077 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.326107 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.379709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.379747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.379758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.379775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.379786 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.484989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.485047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.485058 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.485076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.485091 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.587644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.587700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.587716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.587739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.587753 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.660109 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:10:11.341317044 +0000 UTC Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.662390 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:13 crc kubenswrapper[4772]: E0127 15:08:13.662522 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.690869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.690942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.690962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.690984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.690999 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.793991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.794049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.794060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.794079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.794090 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.896788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.896822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.896830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.896843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.896859 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.999236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.999273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.999283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.999299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:13 crc kubenswrapper[4772]: I0127 15:08:13.999311 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:13Z","lastTransitionTime":"2026-01-27T15:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.087008 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/3.log" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.087580 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/2.log" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.089869 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" exitCode=1 Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.089903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.089936 4772 scope.go:117] "RemoveContainer" containerID="c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.090621 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:08:14 crc kubenswrapper[4772]: E0127 15:08:14.090801 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.102739 4772 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.102804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.102816 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.102835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.102845 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.112590 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.126585 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.139370 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.150807 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.161216 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc 
kubenswrapper[4772]: I0127 15:08:14.178722 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.190056 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.201708 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.205048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.205084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.205093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.205108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.205120 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.213831 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.230368 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:13Z\\\",\\\"message\\\":\\\"-secret-name:openshift-controller-manager-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006e9cc6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 
},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-controller-manager-operator,},ClusterIP:10.217.5.58,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.58],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0127 15:08:13.535689 6829 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc 
kubenswrapper[4772]: I0127 15:08:14.242711 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac2
1602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 
15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.254760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.269921 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.281784 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.293758 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.307274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.307304 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.307312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.307324 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.307335 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.309699 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.319797 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.338547 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.409478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.409507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.409515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.409527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.409535 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.512628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.512685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.512703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.512726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.512744 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.614671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.614711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.614720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.614734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.614744 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.661225 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:12:51.564234114 +0000 UTC Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.662498 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:14 crc kubenswrapper[4772]: E0127 15:08:14.662661 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.662679 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.662745 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:14 crc kubenswrapper[4772]: E0127 15:08:14.662852 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:14 crc kubenswrapper[4772]: E0127 15:08:14.662924 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.677131 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.687616 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.702798 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.717281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.717279 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc 
kubenswrapper[4772]: I0127 15:08:14.717333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.717358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.717373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.717382 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.738116 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.752406 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.767900 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.781475 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.805246 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c77c5fad0f4c478526015a7e2dcc6a18a4a586ead55b8bb16ff40d61ca66f4a7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:07:42Z\\\",\\\"message\\\":\\\"g/owner\\\\\\\":\\\\\\\"openshift-ingress/router-internal-default\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.176\\\\\\\", Port:1936, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0127 15:07:42.483073 6419 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:13Z\\\",\\\"message\\\":\\\"-secret-name:openshift-controller-manager-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006e9cc6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 
},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-controller-manager-operator,},ClusterIP:10.217.5.58,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.58],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0127 15:08:13.535689 6829 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc 
kubenswrapper[4772]: I0127 15:08:14.819884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.819922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.819936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.819953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.819964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.820047 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.831281 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.846062 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.862448 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.877232 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.891241 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.900985 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.920013 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.922249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.922291 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.922300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.922316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 
15:08:14.922326 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:14Z","lastTransitionTime":"2026-01-27T15:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:14 crc kubenswrapper[4772]: I0127 15:08:14.932660 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.023795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.023835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.023846 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.023863 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.023875 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.094235 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/3.log" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.097153 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:08:15 crc kubenswrapper[4772]: E0127 15:08:15.097440 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.108034 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.118100 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.126500 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.126572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.126595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc 
kubenswrapper[4772]: I0127 15:08:15.126626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.126649 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.141920 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9
ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.154728 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.172804 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.186871 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.195265 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.204892 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.216193 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.226740 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.233690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.233734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.233747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.233762 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.233774 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.242498 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.253517 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc077
8ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.261390 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc 
kubenswrapper[4772]: I0127 15:08:15.271593 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac2
1602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 
15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.281141 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.292416 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.301807 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.317630 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:13Z\\\",\\\"message\\\":\\\"-secret-name:openshift-controller-manager-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006e9cc6b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-controller-manager-operator,},ClusterIP:10.217.5.58,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.58],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0127 15:08:13.535689 6829 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:15Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.336142 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.336195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.336214 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.336230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.336334 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.439264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.439318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.439342 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.439364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.439379 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.542088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.542131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.542143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.542160 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.542193 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.645141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.645198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.645207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.645221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.645229 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.662344 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 08:47:23.925426665 +0000 UTC Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.662415 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:15 crc kubenswrapper[4772]: E0127 15:08:15.662548 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.748643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.748682 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.748690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.748703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.748713 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.851649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.851684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.851692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.851705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.851714 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.953931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.953962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.953970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.953983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:15 crc kubenswrapper[4772]: I0127 15:08:15.953992 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:15Z","lastTransitionTime":"2026-01-27T15:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.056376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.056402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.056410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.056428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.056446 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.159490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.159580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.159616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.159648 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.159672 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.262991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.263030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.263073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.263089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.263098 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.365582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.365623 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.365632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.365646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.365656 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.467993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.468053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.468067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.468084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.468095 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.571637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.571681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.571691 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.571708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.571721 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.662853 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.662867 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 08:56:51.946547702 +0000 UTC Jan 27 15:08:16 crc kubenswrapper[4772]: E0127 15:08:16.662985 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.662853 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.663058 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:16 crc kubenswrapper[4772]: E0127 15:08:16.663127 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:16 crc kubenswrapper[4772]: E0127 15:08:16.663224 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.674003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.674037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.674048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.674063 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.674077 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.776409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.776469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.776486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.776512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.776529 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.879453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.879525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.879537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.879556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.879569 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.981980 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.982027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.982039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.982054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:16 crc kubenswrapper[4772]: I0127 15:08:16.982063 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:16Z","lastTransitionTime":"2026-01-27T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.084227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.084568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.084578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.084592 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.084601 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.186441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.186484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.186494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.186510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.186521 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.288910 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.288985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.289003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.289028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.289047 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.392508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.392548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.392566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.392581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.392591 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.496198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.496239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.496252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.496274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.496292 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.598429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.598792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.598911 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.599053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.599210 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.662628 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:17 crc kubenswrapper[4772]: E0127 15:08:17.662780 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.662967 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 18:02:34.461479057 +0000 UTC Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.701472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.701544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.701567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.701593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.701613 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.804149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.804206 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.804219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.804235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.804247 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.907476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.907526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.907543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.907564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:17 crc kubenswrapper[4772]: I0127 15:08:17.907577 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:17Z","lastTransitionTime":"2026-01-27T15:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.009750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.009783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.009793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.009805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.009813 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.112697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.112732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.112744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.112762 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.112774 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.215424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.215471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.215486 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.215503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.215515 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.318647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.318720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.318729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.318746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.318755 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.421355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.421401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.421418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.421441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.421456 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.524289 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.524337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.524348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.524366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.524378 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.628061 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.628130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.628159 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.628232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.628257 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.644508 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.644631 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.644657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.644691 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644732 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.644692111 +0000 UTC m=+148.625301209 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644752 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.644807 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644931 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644971 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644991 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644931 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644967 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.644916478 +0000 UTC m=+148.625525776 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.645079 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.645052192 +0000 UTC m=+148.625661400 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.645086 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.645114 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.644811 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.645199 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.645179195 +0000 UTC m=+148.625788293 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.645234 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.645226127 +0000 UTC m=+148.625835225 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.662723 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.662760 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.662879 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.662942 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.663063 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 15:36:33.756985932 +0000 UTC Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.663131 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:18 crc kubenswrapper[4772]: E0127 15:08:18.663232 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.731281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.731340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.731353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.731370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.731385 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.834189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.834229 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.834238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.834252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.834263 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.936697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.936735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.936746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.936763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:18 crc kubenswrapper[4772]: I0127 15:08:18.936772 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:18Z","lastTransitionTime":"2026-01-27T15:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.039492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.039529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.039540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.039557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.039567 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.142059 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.142100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.142108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.142125 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.142135 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.245485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.245546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.245569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.245591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.245602 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.347876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.347927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.347938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.347953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.347962 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.450656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.450693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.450703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.450717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.450726 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.553469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.553556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.553572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.553589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.553624 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.656369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.656426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.656442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.656466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.656482 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.662659 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:19 crc kubenswrapper[4772]: E0127 15:08:19.662836 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.663147 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 17:47:22.8914139 +0000 UTC Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.758328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.758389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.758406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.758428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.758446 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.862030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.862077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.862089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.862106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.862117 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.964886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.964939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.964955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.964972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:19 crc kubenswrapper[4772]: I0127 15:08:19.964984 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:19Z","lastTransitionTime":"2026-01-27T15:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.067688 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.067733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.067742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.067758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.067768 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.170292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.170367 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.170388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.170416 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.170437 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.273246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.273294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.273308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.273338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.273350 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.376547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.376609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.376626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.376651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.376668 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.479676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.479721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.479738 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.479756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.479766 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.582969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.583015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.583026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.583046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.583064 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.662351 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.662475 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.662483 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:20 crc kubenswrapper[4772]: E0127 15:08:20.662608 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:20 crc kubenswrapper[4772]: E0127 15:08:20.662709 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:20 crc kubenswrapper[4772]: E0127 15:08:20.662843 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.663518 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 02:58:51.448823276 +0000 UTC Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.685748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.685792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.685800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.685815 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.685825 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.787965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.788006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.788014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.788027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.788036 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.891380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.891434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.891450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.891471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.891485 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.995517 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.995562 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.995574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.995591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:20 crc kubenswrapper[4772]: I0127 15:08:20.995606 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:20Z","lastTransitionTime":"2026-01-27T15:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.098953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.098990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.099017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.099036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.099069 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.201532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.201595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.201610 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.201627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.201639 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.305588 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.305656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.305692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.305734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.305758 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.408297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.408349 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.408357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.408373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.408383 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.511696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.511754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.511771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.511796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.511813 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.614921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.614967 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.614976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.614994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.615006 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.662696 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:21 crc kubenswrapper[4772]: E0127 15:08:21.663515 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.663653 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:12:42.761807844 +0000 UTC Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.716925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.716969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.716981 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.716997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.717007 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.820625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.820683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.820696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.820715 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.820730 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.923364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.923412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.923425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.923446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:21 crc kubenswrapper[4772]: I0127 15:08:21.923473 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:21Z","lastTransitionTime":"2026-01-27T15:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.026277 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.026399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.026418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.026444 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.026464 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.128640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.128681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.128693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.128709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.128721 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.231467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.231536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.231554 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.231579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.231595 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.334591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.334643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.334658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.334679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.334693 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.438138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.438268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.438300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.438322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.438339 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.449746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.449783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.449791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.449809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.449819 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.464150 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.470683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.470747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.470766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.470790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.470808 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.493022 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.498230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.498311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.498336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.498370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.498393 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.520436 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.526017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.526088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.526112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.526143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.526222 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.540004 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.544155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.544281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.544298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.544320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.544336 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.562919 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.563134 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.565243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.565308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.565328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.565350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.565366 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.662360 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.662467 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.662607 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.662636 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.662713 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:22 crc kubenswrapper[4772]: E0127 15:08:22.662842 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.664853 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:58:03.60725289 +0000 UTC Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.668640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.668690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.668709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.668729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.668744 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.771709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.771785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.771806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.771833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.771854 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.874612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.874650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.874659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.874677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.874688 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.977158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.977224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.977235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.977250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:22 crc kubenswrapper[4772]: I0127 15:08:22.977261 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:22Z","lastTransitionTime":"2026-01-27T15:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.080153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.080230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.080239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.080252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.080282 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.183617 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.183653 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.183664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.183699 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.183709 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.285965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.285999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.286007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.286021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.286030 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.388646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.388742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.388768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.388798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.388819 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.491253 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.491287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.491298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.491315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.491327 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.593577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.593603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.593611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.593623 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.593631 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.662977 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:23 crc kubenswrapper[4772]: E0127 15:08:23.663223 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.665186 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:33:29.670525141 +0000 UTC Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.695981 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.696004 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.696012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.696024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.696033 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.798775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.798832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.798849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.798872 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.798889 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.902368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.902409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.902419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.902434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:23 crc kubenswrapper[4772]: I0127 15:08:23.902444 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:23Z","lastTransitionTime":"2026-01-27T15:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.003985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.004023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.004039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.004055 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.004065 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.106925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.106976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.106988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.107007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.107021 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.208915 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.208982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.208999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.209028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.209087 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.311437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.311487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.311499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.311516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.311529 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.413854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.413889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.413901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.413917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.413928 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.516193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.516236 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.516248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.516265 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.516277 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.618722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.618767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.618783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.618798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.618810 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.662524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.662565 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.662529 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:24 crc kubenswrapper[4772]: E0127 15:08:24.662674 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:24 crc kubenswrapper[4772]: E0127 15:08:24.662790 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:24 crc kubenswrapper[4772]: E0127 15:08:24.662884 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.665555 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 21:24:58.031957623 +0000 UTC Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.677716 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd55
5552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name
\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.690356 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.702493 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.715630 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.720851 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.720896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.720907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.720929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.720941 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.743526 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.760966 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.774387 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.787088 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0
778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\
\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.800352 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc 
kubenswrapper[4772]: I0127 15:08:24.816627 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.822925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.822965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.822976 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.822993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.823004 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.829561 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.840874 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.854630 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.876361 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:13Z\\\",\\\"message\\\":\\\"-secret-name:openshift-controller-manager-operator-serving-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006e9cc6b \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-controller-manager-operator,},ClusterIP:10.217.5.58,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.58],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0127 15:08:13.535689 6829 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.892104 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.902443 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.913490 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.923802 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.925088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.925138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.925150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:24 crc 
kubenswrapper[4772]: I0127 15:08:24.925191 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:24 crc kubenswrapper[4772]: I0127 15:08:24.925204 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:24Z","lastTransitionTime":"2026-01-27T15:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.027715 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.027768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.027778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.027794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.027804 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.129843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.129908 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.129919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.129933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.129945 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.232808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.232896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.232914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.232939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.232958 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.335428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.335469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.335481 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.335499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.335511 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.438570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.438704 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.438717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.438736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.438751 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.541144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.541196 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.541204 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.541219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.541229 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.643237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.643302 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.643311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.643324 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.643333 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.662120 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:25 crc kubenswrapper[4772]: E0127 15:08:25.662369 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.663188 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:08:25 crc kubenswrapper[4772]: E0127 15:08:25.663337 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.666560 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:55:52.778190418 +0000 UTC Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.672967 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.745664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.745720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.745770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.745789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.745799 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.847676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.847712 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.847720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.847735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.847745 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.950215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.950252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.950262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.950279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:25 crc kubenswrapper[4772]: I0127 15:08:25.950289 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:25Z","lastTransitionTime":"2026-01-27T15:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.052408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.052441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.052449 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.052463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.052472 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.155534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.155575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.155586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.155602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.155612 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.258317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.258381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.258393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.258407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.258418 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.361547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.361616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.361633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.361658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.361674 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.464107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.464156 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.464182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.464200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.464212 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.566578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.566618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.566628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.566642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.566652 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.662280 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.662359 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:26 crc kubenswrapper[4772]: E0127 15:08:26.662464 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.662307 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:26 crc kubenswrapper[4772]: E0127 15:08:26.662560 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:26 crc kubenswrapper[4772]: E0127 15:08:26.662629 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.667003 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:41:01.673331451 +0000 UTC Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.668954 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.669006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.669027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.669132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.669257 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.771890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.771933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.771945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.771962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.771976 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.875022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.875064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.875075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.875093 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.875107 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.977533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.977571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.977580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.977594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:26 crc kubenswrapper[4772]: I0127 15:08:26.977603 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:26Z","lastTransitionTime":"2026-01-27T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.079529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.079567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.079578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.079595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.079607 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.181937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.181982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.181993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.182011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.182023 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.285479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.285526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.285539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.285560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.285571 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.388729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.388780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.388798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.388819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.388838 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.491368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.491401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.491410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.491423 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.491433 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.594374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.594439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.594463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.594497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.594520 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.662387 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:27 crc kubenswrapper[4772]: E0127 15:08:27.662574 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.667761 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 06:25:56.007323477 +0000 UTC Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.697632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.697686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.697704 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.697729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.697747 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.800574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.800644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.800668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.800698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.800720 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.904084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.904153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.904215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.904244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:27 crc kubenswrapper[4772]: I0127 15:08:27.904265 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:27Z","lastTransitionTime":"2026-01-27T15:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.007767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.007843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.007866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.007899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.007923 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.111094 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.111151 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.111162 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.111239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.111256 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.213979 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.214031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.214042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.214060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.214071 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.317550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.317632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.317656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.317686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.317711 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.420808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.420889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.420907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.420936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.420955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.525339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.525419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.525437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.525465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.525483 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.628995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.629064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.629084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.629117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.629136 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.663065 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.663280 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.663373 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:28 crc kubenswrapper[4772]: E0127 15:08:28.663498 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:28 crc kubenswrapper[4772]: E0127 15:08:28.663576 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:28 crc kubenswrapper[4772]: E0127 15:08:28.663742 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.668607 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:23:54.613707222 +0000 UTC Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.731532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.731571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.731584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.731603 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.731617 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.834754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.834798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.834810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.834828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.834840 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.936994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.937044 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.937052 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.937065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:28 crc kubenswrapper[4772]: I0127 15:08:28.937074 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:28Z","lastTransitionTime":"2026-01-27T15:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.040084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.040157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.040226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.040262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.040287 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.141855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.141896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.141907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.141922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.141932 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.244103 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.244150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.244159 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.244276 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.244295 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.346805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.346847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.346858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.346876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.346888 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.449532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.449575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.449591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.449605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.449616 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.552473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.552547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.552568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.552597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.552620 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.654973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.655017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.655026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.655040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.655049 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.662435 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:29 crc kubenswrapper[4772]: E0127 15:08:29.662591 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.668875 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 01:04:24.947734451 +0000 UTC Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.757259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.757310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.757333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.757355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.757370 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.859850 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.859890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.859916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.859943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.859959 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.963259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.963294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.963303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.963318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:29 crc kubenswrapper[4772]: I0127 15:08:29.963326 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:29Z","lastTransitionTime":"2026-01-27T15:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.065879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.065917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.065934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.065951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.065962 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.167887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.167925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.167937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.167953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.167966 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.271740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.271822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.271847 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.271878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.271902 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.375346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.375427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.375445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.375469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.375486 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.478345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.478395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.478417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.478445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.478464 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.580888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.581239 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.581252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.581269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.581281 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.662782 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.662840 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.662782 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:30 crc kubenswrapper[4772]: E0127 15:08:30.662917 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:30 crc kubenswrapper[4772]: E0127 15:08:30.663192 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:30 crc kubenswrapper[4772]: E0127 15:08:30.663238 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.669150 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:12:21.075251844 +0000 UTC Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.684074 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.684113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.684126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.684143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.684156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.786890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.786985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.786995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.787008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.787018 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.889808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.889858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.889874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.889895 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.889908 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.991918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.991965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.991975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.991990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:30 crc kubenswrapper[4772]: I0127 15:08:30.992000 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:30Z","lastTransitionTime":"2026-01-27T15:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.095866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.095921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.095929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.095945 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.095955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.198790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.198849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.198868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.198893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.198910 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.301249 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.301278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.301286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.301299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.301308 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.403884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.403937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.403953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.403973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.403991 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.506228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.506325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.506344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.506360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.506371 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.608956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.609047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.609067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.609092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.609134 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.662852 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:31 crc kubenswrapper[4772]: E0127 15:08:31.663003 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.670268 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 14:04:13.722393681 +0000 UTC Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.711420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.711473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.711488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.711508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.711519 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.813578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.813641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.813663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.813690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.813711 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.916215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.916263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.916278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.916295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:31 crc kubenswrapper[4772]: I0127 15:08:31.916307 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:31Z","lastTransitionTime":"2026-01-27T15:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.019069 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.019111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.019124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.019144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.019155 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.121629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.121714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.121728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.121750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.121764 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.223795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.223865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.223888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.223919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.223942 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.327076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.327123 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.327134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.327155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.327192 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.430991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.431356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.431442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.431473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.431488 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.533938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.534006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.534025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.534047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.534067 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.637002 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.637139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.637157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.637190 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.637202 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.662543 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:32 crc kubenswrapper[4772]: E0127 15:08:32.662777 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.662869 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.662574 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:32 crc kubenswrapper[4772]: E0127 15:08:32.663028 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:32 crc kubenswrapper[4772]: E0127 15:08:32.663146 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.670458 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 17:34:18.814230479 +0000 UTC Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.740238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.740277 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.740288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.740303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.740314 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.843576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.843662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.843686 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.843716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.843738 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.946243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.946318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.946341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.946371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.946395 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.957768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.957817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.957828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.957846 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.957858 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: E0127 15:08:32.976944 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.981252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.981282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.981292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.981306 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.981315 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:32 crc kubenswrapper[4772]: E0127 15:08:32.991985 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.995032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.995067 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.995077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.995092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:32 crc kubenswrapper[4772]: I0127 15:08:32.995101 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:32Z","lastTransitionTime":"2026-01-27T15:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: E0127 15:08:33.005134 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.007749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.007779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.007788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.007800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.007810 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: E0127 15:08:33.018752 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.021231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.021268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.021279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.021290 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.021299 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: E0127 15:08:33.033204 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:33 crc kubenswrapper[4772]: E0127 15:08:33.033313 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.048675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.048715 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.048725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.048738 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.048746 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.150830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.150865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.150877 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.150891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.150902 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.253754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.253808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.253820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.253839 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.253853 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.356620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.356683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.356697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.356716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.356732 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.459968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.460032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.460048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.460068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.460080 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.562124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.562230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.562248 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.562273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.562289 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.662562 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:33 crc kubenswrapper[4772]: E0127 15:08:33.662766 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.664589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.664650 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.664660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.664694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.664706 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.671116 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 09:21:29.382700586 +0000 UTC Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.766992 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.767019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.767027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.767040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.767049 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.870594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.870649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.870666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.870687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.870704 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.973727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.973766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.973776 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.973792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:33 crc kubenswrapper[4772]: I0127 15:08:33.973802 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:33Z","lastTransitionTime":"2026-01-27T15:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.077094 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.077147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.077157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.077190 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.077201 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.180458 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.180512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.180528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.180546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.180560 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.283897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.283947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.283961 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.283984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.284001 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.387399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.387442 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.387455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.387473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.387485 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.416022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:34 crc kubenswrapper[4772]: E0127 15:08:34.416245 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:08:34 crc kubenswrapper[4772]: E0127 15:08:34.416340 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs podName:371016c8-5a23-427d-aa0a-0faa241d86a7 nodeName:}" failed. No retries permitted until 2026-01-27 15:09:38.416318497 +0000 UTC m=+164.396927645 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs") pod "network-metrics-daemon-ql2vx" (UID: "371016c8-5a23-427d-aa0a-0faa241d86a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.490059 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.490106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.490117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.490135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.490147 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.592537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.592563 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.592571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.592583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.592591 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.662756 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.662887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:34 crc kubenswrapper[4772]: E0127 15:08:34.662931 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.662993 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:34 crc kubenswrapper[4772]: E0127 15:08:34.663090 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:34 crc kubenswrapper[4772]: E0127 15:08:34.663162 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.671577 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:02:18.202682362 +0000 UTC Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.681239 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a2b20172f77b5d595f6543c954936ade12fd0cf0625b1abe17cb400adfd8842\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-
identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ab1226587c192dfc7094511fb2d0ce13cd3e47e84a683ec1a3a175f2496c015\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.693364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67794a44-d793-4fd7-9e54-e40437f67c0b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b80a5eaeeb793907d34b34a1bf5727c3da1dd01beb45fd8ebdc224b650b9f9aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d95f231ee1013dc5475acac704b796538ef0050
cd94e435a3382bd12b7cbf19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh6ph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4hwxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.695231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.695429 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.695447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc 
kubenswrapper[4772]: I0127 15:08:34.695465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.695476 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.713158 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"736264c8-cd18-479a-88ba-e1ec15dbfdae\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:08:13Z\\\",\\\"message\\\":\\\"-secret-name:openshift-controller-manager-operator-serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc006e9cc6b \\\\u003cnil\\\\u003e}] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{0 8443 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app: openshift-controller-manager-operator,},ClusterIP:10.217.5.58,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.58],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0127 15:08:13.535689 6829 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:08:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c55ceb84c37125c4dc
988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2dt6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n2khk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.728315 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11f71341-1cdc-430d-8d90-a87af2a493f1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:07:14Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0127 15:07:09.417986 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:07:09.418905 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2455853299/tls.crt::/tmp/serving-cert-2455853299/tls.key\\\\\\\"\\\\nI0127 15:07:14.676378 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:07:14.679920 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:07:14.679946 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:07:14.679972 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:07:14.679980 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:07:14.686726 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:07:14.686753 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686793 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:07:14.686798 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:07:14.686801 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:07:14.686805 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:07:14.686807 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0127 15:07:14.686800 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0127 15:07:14.690409 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b3609eb7fbaf4e8741fc38683a8eb927
29ac16475b035eb0476a9546b007bd5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.740394 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6cfdf02-101c-4f18-9ebe-16002352afce\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5b9360189745deea4c8950c4f3b73762c7cb098452c1f31c3df9ab99fc31ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3be141392fbb6ae8ce51085cfc186745bb5ac272899e3202aaed2b7093855c4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c8aeb62c273e8cb31f01742805bd40c4184821c4bbd94f319ff5f074265d75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://64098f88e61c989a4f2048d222906eacf8c0525f26e109913c1718c9dfb67d20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.755308 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.769134 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1acef947-6310-4ac0-bc84-a06d91f84cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://790f07e4c1ef52bf6e541034bdd5cc70277cdd5522fd74919677e8dc97f13490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9580aa20d3a39dcbe87c6eb9b7f294c101a80c6a360abe4caf6e47270bc538\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e668395335c1e28bff6af3da4358538cbc78d8a8837cc1bf1fa58053d0a792d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4981ab36202d45fb2e3b5474b83bb97a70ee85761ed4595969d3629a9a14c7cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e7fe
b8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e7feb8db10ed98640d39d316ec7bee08cf060c2e5056803a4a7b6afe5bc9dbd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c31746847c1db9cc19a3f43cedd3a346b1991a96cca59a07c709e8a57f546e57\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:22Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9ee51cff497be766cad5cd7ead7b3e51e5e98af36d4af1a9d0da91a55a079d8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntsst\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-c7pdz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.781604 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x7jwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-27T15:08:04Z\\\",\\\"message\\\":\\\"2026-01-27T15:07:18+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570\\\\n2026-01-27T15:07:18+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_85dd4958-953b-469d-b48e-3b998cab8570 to /host/opt/cni/bin/\\\\n2026-01-27T15:07:19Z [verbose] multus-daemon started\\\\n2026-01-27T15:07:19Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:08:04Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:08:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8d4sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:17Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x7jwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.793028 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7947125a-23ff-4bc4-9f9a-743173e3bf96\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29ec44d988753e2d56359eaeae0782085a35526439972fa9fdef4dca6a95a285\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89cf43337dd65773891148fae7e97baf3bad08bcc42d35ccbcb396924a5e1328\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5a044713f62e865404e3a08eb2b72e000eb4418bd86be24564fc1dff9c3fb8ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.798801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.798853 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.798861 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.798879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.798890 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.802036 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e737111-6155-41bb-8a02-7dd880eade01\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://109762c1c77a786a3c953ead1b65a17e9401f8ba205858ffc79f6e188d5005df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a8432f1dcc97ca9d30249542dfebc79f098859c0f7e04a637be764939fb6072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a8432f1dcc97ca9d30249542dfebc79f098859c0f7e04a637be764939fb6072\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.810551 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-dtdj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"95a893d4-4faa-40b2-b505-9698fe428ba8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e03e42cc2bd2d62e397d8138f1bfb5d4f3ef8ca22faec9ae48f6bcf5b22d964\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pswh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-dtdj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.827243 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9eb558c7-0e97-42c7-96e9-8b170ac2a3df\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6faa9eac1866fffe234666ebc0ccebaa65ed897a10df3c2f4c60af170a24ff10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f731c47cd6cbadce3d6caf107e6f7b47ccf52a192f27d5c88455d94ed51e724c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ba8d2a22f341cf2a156befb63aabda598800197b206ac09ceae36296b428d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af25a6897fe69ff57615b72ae31abd93d49a9dfd960f98842eaf8db1b327fe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c88a3bea343764365f32d13ed70b783845cce76318d2378c7bbcc2005e97326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:06:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b19c9dbf8cf2f44ddd32a80e600c191dfb1efb1ca81580a520c4f8eb767332a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f477eb4e1b649a5aa1f5e85f03691700b168b968e263e772a6c54f298e10b75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:56Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cddf104d24d1462376c31b95bf9f29f354d4d7c5bb60f0b3391b6f4d692b7970\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:06:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:06:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:06:54Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.838890 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1528b753084926155afff8e9a49f8be08e24697c358fc5079cd2ef0e88449\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.848763 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4877bb2bd2a893b384e8153ff7b20a81a4640200989e85efd4c654bb9ecf0c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.857137 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-q46tm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fed65bae-f1c4-4c97-bb6d-d4144fe2532b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a94e333403eaaf5f1c05153d18d284dda3a2cde1d727e5652613049041fe348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"n
ode-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qln7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:16Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-q46tm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.867609 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3cc8fde5-4905-4fb1-b683-27ea4921b462\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b47fc0778ba4c5a1e12700735e6f9c52a7341b9eac61071607902a6ec8daf02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://545dcc1be1335a1acf93f16d2e5b4a266dce5
b7e736b7c56b80bbf56b3248ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:07:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ghx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wkvpx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.876750 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"371016c8-5a23-427d-aa0a-0faa241d86a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:30Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v8l57\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:07:30Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-ql2vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc 
kubenswrapper[4772]: I0127 15:08:34.886691 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.896348 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:07:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.900837 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.900867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.900880 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.900896 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:34 crc kubenswrapper[4772]: I0127 15:08:34.900908 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:34Z","lastTransitionTime":"2026-01-27T15:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.002803 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.002849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.002862 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.002879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.002892 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.105576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.105618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.105627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.105643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.105654 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.207848 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.207897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.207908 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.207926 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.207937 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.310000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.310033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.310043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.310057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.310067 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.413313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.413396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.413428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.413467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.413489 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.516310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.516351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.516362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.516378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.516390 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.617902 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.617929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.617937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.617951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.617960 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.662896 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:35 crc kubenswrapper[4772]: E0127 15:08:35.663049 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.672098 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 21:09:33.025518333 +0000 UTC Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.720012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.720091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.720117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.720145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.720163 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.822329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.822361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.822371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.822385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.822395 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.925535 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.925641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.925687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.925707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:35 crc kubenswrapper[4772]: I0127 15:08:35.925721 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:35Z","lastTransitionTime":"2026-01-27T15:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.028618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.028658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.028667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.028683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.028692 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.131301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.131411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.131447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.131476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.131497 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.234565 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.234609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.234620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.234635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.234644 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.336731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.336793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.336812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.336840 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.336858 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.440009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.440063 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.440076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.440094 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.440107 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.542655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.542706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.542729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.542750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.542765 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.644656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.644714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.644731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.644756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.644771 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.662519 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.662544 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.662676 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:36 crc kubenswrapper[4772]: E0127 15:08:36.662817 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:36 crc kubenswrapper[4772]: E0127 15:08:36.662882 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:36 crc kubenswrapper[4772]: E0127 15:08:36.662961 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.672680 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 12:06:33.848817085 +0000 UTC Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.746986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.747025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.747036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.747051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.747062 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.849363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.849399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.849409 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.849424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.849447 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.951892 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.951948 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.951960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.951977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:36 crc kubenswrapper[4772]: I0127 15:08:36.951990 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:36Z","lastTransitionTime":"2026-01-27T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.053849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.053888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.053899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.053916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.053926 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.155546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.155602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.155616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.155633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.155646 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.257243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.257282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.257294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.257311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.257324 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.359734 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.359784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.359800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.359815 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.359826 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.461772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.461806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.461814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.461827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.461835 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.564466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.564542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.564555 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.564572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.564581 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.662926 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:37 crc kubenswrapper[4772]: E0127 15:08:37.663095 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.666968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.667051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.667071 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.667096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.667115 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.672832 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:57:41.81792312 +0000 UTC Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.770220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.770284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.770299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.770322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.770336 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.872638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.872687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.872698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.872724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.872741 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.974754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.974789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.974796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.974810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:37 crc kubenswrapper[4772]: I0127 15:08:37.974821 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:37Z","lastTransitionTime":"2026-01-27T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.076662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.076705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.076714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.076729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.076738 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.180037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.180077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.180086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.180098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.180108 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.282402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.282445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.282493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.282514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.282525 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.385026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.385101 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.385118 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.385143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.385158 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.487720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.487767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.487788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.487810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.487825 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.591492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.591539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.591549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.591568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.591578 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.662035 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.662205 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:38 crc kubenswrapper[4772]: E0127 15:08:38.662295 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.662355 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:38 crc kubenswrapper[4772]: E0127 15:08:38.662523 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:38 crc kubenswrapper[4772]: E0127 15:08:38.662643 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.673094 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 07:18:27.231225909 +0000 UTC Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.694164 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.694237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.694250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.694271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.694326 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.798101 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.798138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.798148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.798186 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.798204 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.901053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.901102 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.901111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.901126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:38 crc kubenswrapper[4772]: I0127 15:08:38.901135 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:38Z","lastTransitionTime":"2026-01-27T15:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.003462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.003503 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.003519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.003542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.003559 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.106494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.106534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.106544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.106560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.106569 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.209206 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.209293 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.209310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.209335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.209351 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.311703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.311732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.311740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.311753 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.311762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.414641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.414693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.414707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.414726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.414739 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.517328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.517368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.517379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.517397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.517409 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.620849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.620914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.620933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.620956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.620968 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.662323 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:39 crc kubenswrapper[4772]: E0127 15:08:39.662964 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.663481 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:08:39 crc kubenswrapper[4772]: E0127 15:08:39.663753 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-n2khk_openshift-ovn-kubernetes(736264c8-cd18-479a-88ba-e1ec15dbfdae)\"" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.673274 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 08:03:33.417504168 +0000 UTC Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.724476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.724558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.724573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.724597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.724613 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.827579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.827642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.827654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.827675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.827691 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.930382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.930451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.930466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.930489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:39 crc kubenswrapper[4772]: I0127 15:08:39.930507 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:39Z","lastTransitionTime":"2026-01-27T15:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.032819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.032932 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.032943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.032956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.032965 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.136798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.136835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.136843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.136857 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.136866 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.239402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.239446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.239459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.239474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.239482 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.342300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.342365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.342388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.342422 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.342442 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.444278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.444326 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.444341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.444360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.444372 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.546975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.547033 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.547043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.547057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.547066 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.649859 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.649904 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.649923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.649941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.649952 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.662335 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:40 crc kubenswrapper[4772]: E0127 15:08:40.662476 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.662513 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.662571 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:40 crc kubenswrapper[4772]: E0127 15:08:40.662623 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:40 crc kubenswrapper[4772]: E0127 15:08:40.662659 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.673857 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:21:21.841021375 +0000 UTC Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.751987 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.752024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.752032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.752046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.752055 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.854219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.854254 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.854263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.854276 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.854285 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.956629 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.956689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.956708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.956733 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:40 crc kubenswrapper[4772]: I0127 15:08:40.956750 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:40Z","lastTransitionTime":"2026-01-27T15:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.059258 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.059306 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.059316 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.059331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.059340 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.162811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.162849 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.162859 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.162874 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.162884 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.264955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.264995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.265008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.265025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.265037 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.367618 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.367682 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.367712 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.367739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.367757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.470897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.470935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.470943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.470956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.470964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.573330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.573363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.573371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.573384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.573393 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.662360 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:41 crc kubenswrapper[4772]: E0127 15:08:41.662497 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.674467 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 02:50:15.293223886 +0000 UTC Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.676047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.676079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.676090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.676107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.676116 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.779636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.779698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.779716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.779745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.779762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.883260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.883337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.883356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.883381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.883399 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.986147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.986245 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.986263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.986291 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:41 crc kubenswrapper[4772]: I0127 15:08:41.986309 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:41Z","lastTransitionTime":"2026-01-27T15:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.088389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.088435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.088446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.088466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.088479 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.190925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.191006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.191031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.191062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.191087 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.294884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.294927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.294936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.294953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.294964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.397413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.397459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.397469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.397487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.397499 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.499666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.499716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.499745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.499763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.499774 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.602455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.602498 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.602511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.602530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.602546 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.662709 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.662863 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:42 crc kubenswrapper[4772]: E0127 15:08:42.663033 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.663061 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:42 crc kubenswrapper[4772]: E0127 15:08:42.663589 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:42 crc kubenswrapper[4772]: E0127 15:08:42.663681 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.674650 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 11:58:18.536286688 +0000 UTC Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.705370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.705420 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.705433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.705449 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.705461 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.808581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.808643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.808654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.808669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.808679 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.911967 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.912068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.912091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.912113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:42 crc kubenswrapper[4772]: I0127 15:08:42.912127 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:42Z","lastTransitionTime":"2026-01-27T15:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.015519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.015558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.015567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.015582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.015591 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.108866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.108906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.108918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.108938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.108955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.122486 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.129684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.129725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.129735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.129758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.129772 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.142909 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.146929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.146964 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.146975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.147000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.147011 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.161569 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.165930 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.165977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.165989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.166007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.166020 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.178019 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.181381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.181418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.181426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.181441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.181450 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.191904 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3e5a8897-2f3c-4c79-87ef-bcf3ebf59cdb\\\",\\\"systemUUID\\\":\\\"3933c4f3-43c9-48b4-998d-ee6c7e3cb9de\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:08:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.192019 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.193288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.193323 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.193335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.193350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.193361 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.295917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.295962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.295973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.295991 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.296002 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.398454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.398510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.398522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.398542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.398553 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.501532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.501574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.501585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.501599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.501608 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.604244 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.604319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.604346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.604365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.604382 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.661896 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:43 crc kubenswrapper[4772]: E0127 15:08:43.662217 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.675212 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:30:16.891761801 +0000 UTC Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.706903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.706941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.706952 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.706968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.706979 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.809466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.809532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.809549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.809567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.809579 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.912226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.912274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.912287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.912305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:43 crc kubenswrapper[4772]: I0127 15:08:43.912318 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:43Z","lastTransitionTime":"2026-01-27T15:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.015335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.015381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.015391 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.015404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.015414 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.117455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.117735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.117814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.117885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.117950 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.220228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.220261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.220288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.220314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.220326 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.323340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.323395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.323412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.323431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.323442 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.426054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.426102 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.426115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.426132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.426144 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.529379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.529433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.529445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.529462 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.529471 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.633351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.633413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.633425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.633448 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.633461 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.662715 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.662830 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.662908 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:44 crc kubenswrapper[4772]: E0127 15:08:44.663139 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:44 crc kubenswrapper[4772]: E0127 15:08:44.663230 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:44 crc kubenswrapper[4772]: E0127 15:08:44.663291 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.676062 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 08:14:04.428354454 +0000 UTC Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.703729 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dtdj6" podStartSLOduration=88.703678825 podStartE2EDuration="1m28.703678825s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.703432318 +0000 UTC m=+110.684041446" watchObservedRunningTime="2026-01-27 15:08:44.703678825 +0000 UTC m=+110.684287923" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.736401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.736510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.736525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.736550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.736564 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.739089 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=90.739065049 podStartE2EDuration="1m30.739065049s" podCreationTimestamp="2026-01-27 15:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.735674421 +0000 UTC m=+110.716283519" watchObservedRunningTime="2026-01-27 15:08:44.739065049 +0000 UTC m=+110.719674167" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.824393 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q46tm" podStartSLOduration=88.824372436 podStartE2EDuration="1m28.824372436s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.810629889 +0000 UTC m=+110.791239007" watchObservedRunningTime="2026-01-27 15:08:44.824372436 +0000 UTC m=+110.804981534" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.834038 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wkvpx" podStartSLOduration=87.834017065 podStartE2EDuration="1m27.834017065s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.824079088 +0000 UTC m=+110.804688196" watchObservedRunningTime="2026-01-27 15:08:44.834017065 +0000 UTC m=+110.814626163" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.838985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 
27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.839264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.839348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.839440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.839521 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.861296 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.861279224 podStartE2EDuration="57.861279224s" podCreationTimestamp="2026-01-27 15:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.860983105 +0000 UTC m=+110.841592203" watchObservedRunningTime="2026-01-27 15:08:44.861279224 +0000 UTC m=+110.841888322" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.887269 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podStartSLOduration=88.887250985 podStartE2EDuration="1m28.887250985s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.886636117 +0000 UTC m=+110.867245225" watchObservedRunningTime="2026-01-27 15:08:44.887250985 +0000 UTC m=+110.867860093" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.922610 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.922580067 podStartE2EDuration="1m30.922580067s" podCreationTimestamp="2026-01-27 15:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.921960849 +0000 UTC m=+110.902569947" watchObservedRunningTime="2026-01-27 15:08:44.922580067 +0000 UTC m=+110.903189165" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.934072 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.934049168 podStartE2EDuration="19.934049168s" podCreationTimestamp="2026-01-27 15:08:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.932767431 +0000 UTC m=+110.913376529" watchObservedRunningTime="2026-01-27 15:08:44.934049168 +0000 UTC m=+110.914658266" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.942678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.942728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.942738 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.942759 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.942772 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:44Z","lastTransitionTime":"2026-01-27T15:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:44 crc kubenswrapper[4772]: I0127 15:08:44.980956 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c7pdz" podStartSLOduration=88.980932184 podStartE2EDuration="1m28.980932184s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:44.979952356 +0000 UTC m=+110.960561454" watchObservedRunningTime="2026-01-27 15:08:44.980932184 +0000 UTC m=+110.961541282" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.026103 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x7jwx" podStartSLOduration=89.02608288 podStartE2EDuration="1m29.02608288s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:45.009323996 +0000 UTC m=+110.989933094" watchObservedRunningTime="2026-01-27 15:08:45.02608288 +0000 UTC m=+111.006691978" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.045188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.045240 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.045250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.045268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.045279 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.147771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.147813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.147825 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.147843 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.147854 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.251688 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.251740 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.251751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.251774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.251785 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.356097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.356132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.356140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.356154 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.356179 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.459841 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.459912 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.459933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.459958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.459977 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.563725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.563793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.563809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.563836 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.563850 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.662494 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:45 crc kubenswrapper[4772]: E0127 15:08:45.663230 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.666711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.666767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.666783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.666801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.666815 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.676841 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 16:28:44.801174951 +0000 UTC Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.769761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.769801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.769810 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.769827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.769836 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.872801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.872862 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.872880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.872900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.872915 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.976057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.976099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.976108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.976125 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:45 crc kubenswrapper[4772]: I0127 15:08:45.976136 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:45Z","lastTransitionTime":"2026-01-27T15:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.079111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.079337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.079377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.079408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.079430 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.181439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.181470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.181480 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.181496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.181507 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.284012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.284046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.284053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.284068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.284078 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.386336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.386386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.386399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.386418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.386429 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.490089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.490159 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.490190 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.490211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.490223 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.592336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.592369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.592378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.592393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.592403 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.662887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.662961 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.662889 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:46 crc kubenswrapper[4772]: E0127 15:08:46.663015 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:46 crc kubenswrapper[4772]: E0127 15:08:46.663214 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:46 crc kubenswrapper[4772]: E0127 15:08:46.663335 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.678044 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 09:43:43.180326489 +0000 UTC Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.694996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.695052 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.695060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.695073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.695083 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.797530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.797575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.797583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.797597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.797606 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.899664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.899700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.899709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.899724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:46 crc kubenswrapper[4772]: I0127 15:08:46.899734 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:46Z","lastTransitionTime":"2026-01-27T15:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.001319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.001364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.001376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.001395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.001407 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.104034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.104080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.104091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.104108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.104120 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.206555 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.206588 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.206596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.206608 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.206617 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.309333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.309365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.309375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.309405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.309419 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.412314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.412346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.412355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.412369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.412381 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.515221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.515259 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.515268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.515283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.515292 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.618159 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.618217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.618227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.618240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.618249 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.662252 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:47 crc kubenswrapper[4772]: E0127 15:08:47.662388 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.679143 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 15:32:06.965387102 +0000 UTC Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.720107 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.720153 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.720163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.720211 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.720224 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.822884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.822922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.822939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.822954 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.822966 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.925105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.925136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.925144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.925158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:47 crc kubenswrapper[4772]: I0127 15:08:47.925199 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:47Z","lastTransitionTime":"2026-01-27T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.027308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.027338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.027348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.027365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.027376 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.484751 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.484787 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.484797 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.484812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.484821 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.586677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.586720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.586731 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.586746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.586757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.662312 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.662353 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:48 crc kubenswrapper[4772]: E0127 15:08:48.662643 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.662729 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:48 crc kubenswrapper[4772]: E0127 15:08:48.662794 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:48 crc kubenswrapper[4772]: E0127 15:08:48.662924 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.679447 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:17:02.770680884 +0000 UTC Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.689827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.689863 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.689871 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.689886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.689899 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.791966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.792003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.792014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.792027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.792036 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.894488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.894548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.894560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.894577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.894590 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.997344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.997386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.997396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.997410 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:48 crc kubenswrapper[4772]: I0127 15:08:48.997420 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:48Z","lastTransitionTime":"2026-01-27T15:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.099594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.099634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.099644 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.099659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.099670 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.202529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.203254 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.203283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.203306 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.203319 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.307075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.307132 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.307148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.307189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.307206 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.410499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.410566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.410597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.410624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.410648 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.513396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.513446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.513459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.513475 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.513485 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.616088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.616208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.616258 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.616281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.616297 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.662775 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:49 crc kubenswrapper[4772]: E0127 15:08:49.662955 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.680229 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 17:09:09.640463772 +0000 UTC Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.718963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.719005 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.719028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.719048 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.719061 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.822307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.822359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.822375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.822400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.822418 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.925958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.926007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.926028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.926055 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:49 crc kubenswrapper[4772]: I0127 15:08:49.926067 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:49Z","lastTransitionTime":"2026-01-27T15:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.029156 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.029263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.029288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.029319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.029340 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.132640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.132721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.132737 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.132763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.132778 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.235298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.235385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.235400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.235419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.235433 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.338768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.338835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.338853 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.338880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.338893 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.443655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.443729 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.443744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.443766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.443783 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.546888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.546968 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.546990 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.547030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.547070 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.649227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.649267 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.649280 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.649297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.649310 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.662495 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:50 crc kubenswrapper[4772]: E0127 15:08:50.662656 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.662887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:50 crc kubenswrapper[4772]: E0127 15:08:50.662974 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.663292 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:50 crc kubenswrapper[4772]: E0127 15:08:50.663366 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.681285 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 11:11:19.139125138 +0000 UTC Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.751461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.751493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.751501 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.751515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.751525 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.853534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.853571 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.853580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.853596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.853607 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.956895 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.956944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.956956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.956974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:50 crc kubenswrapper[4772]: I0127 15:08:50.956986 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:50Z","lastTransitionTime":"2026-01-27T15:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.059347 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.059407 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.059428 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.059450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.059465 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.162634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.162678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.162689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.162707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.162720 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.265112 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.265144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.265186 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.265204 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.265214 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.369492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.369569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.369589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.369613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.369631 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.472908 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.472946 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.472955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.472973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.472985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.499121 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/1.log" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.499781 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/0.log" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.499912 4772 generic.go:334] "Generic (PLEG): container finished" podID="87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8" containerID="9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356" exitCode=1 Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.500039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerDied","Data":"9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.500148 4772 scope.go:117] "RemoveContainer" containerID="ba06c066217d03c059fbd555552d87574ea4ec17f72937330155f4bfbc4e3a33" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.500841 4772 scope.go:117] "RemoveContainer" containerID="9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356" Jan 27 15:08:51 crc kubenswrapper[4772]: E0127 15:08:51.501118 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-x7jwx_openshift-multus(87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8)\"" pod="openshift-multus/multus-x7jwx" podUID="87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.528355 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podStartSLOduration=97.528308138 podStartE2EDuration="1m37.528308138s" podCreationTimestamp="2026-01-27 15:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:45.026014918 +0000 UTC m=+111.006624016" watchObservedRunningTime="2026-01-27 15:08:51.528308138 +0000 UTC m=+117.508917236" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.583591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.583633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.583642 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.583660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.583669 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.662884 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:51 crc kubenswrapper[4772]: E0127 15:08:51.663085 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.682552 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 14:03:21.587268973 +0000 UTC Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.687452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.687507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.687527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.687548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.687559 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.790476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.790544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.790560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.790585 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.790602 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.893482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.893531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.893543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.893559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.893576 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.995791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.995838 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.995852 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.995868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:51 crc kubenswrapper[4772]: I0127 15:08:51.995879 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:51Z","lastTransitionTime":"2026-01-27T15:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.099535 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.099614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.099631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.099656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.099676 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.202988 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.203039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.203051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.203070 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.203083 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.305566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.305654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.305667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.305683 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.305695 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.408068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.408113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.408124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.408139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.408150 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.504730 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/1.log" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.510275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.510315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.510329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.510344 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.510355 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.613121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.613155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.613163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.613193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.613201 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.662897 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.662910 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.662926 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:52 crc kubenswrapper[4772]: E0127 15:08:52.663079 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:52 crc kubenswrapper[4772]: E0127 15:08:52.663305 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:52 crc kubenswrapper[4772]: E0127 15:08:52.663466 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.685136 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 17:57:49.888874258 +0000 UTC Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.716591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.716829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.716917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.717040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.717139 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.819951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.820003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.820014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.820032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.820043 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.922657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.922695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.922704 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.922718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:52 crc kubenswrapper[4772]: I0127 15:08:52.922727 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:52Z","lastTransitionTime":"2026-01-27T15:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.025235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.025273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.025283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.025297 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.025308 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:53Z","lastTransitionTime":"2026-01-27T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.128406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.128446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.128454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.128468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.128479 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:53Z","lastTransitionTime":"2026-01-27T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.231133 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.231189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.231199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.231213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.231224 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:53Z","lastTransitionTime":"2026-01-27T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.233434 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.233477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.233489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.233508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.233519 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:08:53Z","lastTransitionTime":"2026-01-27T15:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.276892 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr"] Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.277285 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.280545 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.280918 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.281050 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.281192 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.324561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.324830 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.324975 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.325073 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.325192 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.426123 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.426982 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 
27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.427019 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.427042 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.427116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.427260 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.427268 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.428743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.432256 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.444491 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/765e5a84-2a7d-43fa-a02c-e3800d9b6fd8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-jv7kr\" (UID: \"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.596103 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" Jan 27 15:08:53 crc kubenswrapper[4772]: W0127 15:08:53.617752 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765e5a84_2a7d_43fa_a02c_e3800d9b6fd8.slice/crio-c223f5a24ed190a2600efb87a34c7e266fac233ace6d0338754e7f0d86e41ffa WatchSource:0}: Error finding container c223f5a24ed190a2600efb87a34c7e266fac233ace6d0338754e7f0d86e41ffa: Status 404 returned error can't find the container with id c223f5a24ed190a2600efb87a34c7e266fac233ace6d0338754e7f0d86e41ffa Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.661837 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:53 crc kubenswrapper[4772]: E0127 15:08:53.662057 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.662944 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.685631 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 04:55:54.576859208 +0000 UTC Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.685679 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 15:08:53 crc kubenswrapper[4772]: I0127 15:08:53.693733 4772 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.514027 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/3.log" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.517180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerStarted","Data":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.517604 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.519012 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" event={"ID":"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8","Type":"ContainerStarted","Data":"2d1bb5111d302a6e71b243f0bcee9a98b6fc9a0bebc10bd5befbef42ffcdad72"} Jan 27 15:08:54 crc 
kubenswrapper[4772]: I0127 15:08:54.519053 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" event={"ID":"765e5a84-2a7d-43fa-a02c-e3800d9b6fd8","Type":"ContainerStarted","Data":"c223f5a24ed190a2600efb87a34c7e266fac233ace6d0338754e7f0d86e41ffa"} Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.542838 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podStartSLOduration=98.542822159 podStartE2EDuration="1m38.542822159s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:54.54214716 +0000 UTC m=+120.522756268" watchObservedRunningTime="2026-01-27 15:08:54.542822159 +0000 UTC m=+120.523431257" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.575267 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-jv7kr" podStartSLOduration=98.575247587 podStartE2EDuration="1m38.575247587s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:08:54.555581178 +0000 UTC m=+120.536190286" watchObservedRunningTime="2026-01-27 15:08:54.575247587 +0000 UTC m=+120.555856685" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.575569 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ql2vx"] Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.575667 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:54 crc kubenswrapper[4772]: E0127 15:08:54.575746 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:54 crc kubenswrapper[4772]: E0127 15:08:54.662738 4772 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.662737 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:54 crc kubenswrapper[4772]: I0127 15:08:54.662782 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:54 crc kubenswrapper[4772]: E0127 15:08:54.663658 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:54 crc kubenswrapper[4772]: E0127 15:08:54.663732 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:54 crc kubenswrapper[4772]: E0127 15:08:54.750500 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:08:55 crc kubenswrapper[4772]: I0127 15:08:55.662969 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:55 crc kubenswrapper[4772]: E0127 15:08:55.663118 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:55 crc kubenswrapper[4772]: I0127 15:08:55.662964 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:55 crc kubenswrapper[4772]: E0127 15:08:55.663629 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:56 crc kubenswrapper[4772]: I0127 15:08:56.662433 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:56 crc kubenswrapper[4772]: E0127 15:08:56.662581 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:56 crc kubenswrapper[4772]: I0127 15:08:56.662636 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:56 crc kubenswrapper[4772]: E0127 15:08:56.662836 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:57 crc kubenswrapper[4772]: I0127 15:08:57.662451 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:57 crc kubenswrapper[4772]: I0127 15:08:57.662628 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:57 crc kubenswrapper[4772]: E0127 15:08:57.662743 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:57 crc kubenswrapper[4772]: E0127 15:08:57.662923 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:58 crc kubenswrapper[4772]: I0127 15:08:58.662898 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:08:58 crc kubenswrapper[4772]: I0127 15:08:58.663002 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:08:58 crc kubenswrapper[4772]: E0127 15:08:58.663111 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:08:58 crc kubenswrapper[4772]: E0127 15:08:58.663295 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:08:59 crc kubenswrapper[4772]: I0127 15:08:59.662589 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:08:59 crc kubenswrapper[4772]: E0127 15:08:59.662730 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:08:59 crc kubenswrapper[4772]: I0127 15:08:59.662589 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:08:59 crc kubenswrapper[4772]: E0127 15:08:59.662828 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:08:59 crc kubenswrapper[4772]: E0127 15:08:59.751954 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:09:00 crc kubenswrapper[4772]: I0127 15:09:00.662063 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:00 crc kubenswrapper[4772]: E0127 15:09:00.662347 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:09:00 crc kubenswrapper[4772]: I0127 15:09:00.662539 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:00 crc kubenswrapper[4772]: E0127 15:09:00.662773 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:09:01 crc kubenswrapper[4772]: I0127 15:09:01.662626 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:01 crc kubenswrapper[4772]: I0127 15:09:01.662653 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:01 crc kubenswrapper[4772]: E0127 15:09:01.662795 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:09:01 crc kubenswrapper[4772]: E0127 15:09:01.662922 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:09:02 crc kubenswrapper[4772]: I0127 15:09:02.662884 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:02 crc kubenswrapper[4772]: E0127 15:09:02.663029 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:09:02 crc kubenswrapper[4772]: I0127 15:09:02.663916 4772 scope.go:117] "RemoveContainer" containerID="9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356" Jan 27 15:09:02 crc kubenswrapper[4772]: I0127 15:09:02.664582 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:02 crc kubenswrapper[4772]: E0127 15:09:02.664871 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:09:03 crc kubenswrapper[4772]: I0127 15:09:03.547579 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/1.log" Jan 27 15:09:03 crc kubenswrapper[4772]: I0127 15:09:03.547894 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerStarted","Data":"a5fee45d3fc79618abfe1fb780f6741fbf20558f07d7edf5c931f442a9c1c7dd"} Jan 27 15:09:03 crc kubenswrapper[4772]: I0127 15:09:03.662631 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:03 crc kubenswrapper[4772]: I0127 15:09:03.662657 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:03 crc kubenswrapper[4772]: E0127 15:09:03.662817 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:09:03 crc kubenswrapper[4772]: E0127 15:09:03.662961 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ql2vx" podUID="371016c8-5a23-427d-aa0a-0faa241d86a7" Jan 27 15:09:04 crc kubenswrapper[4772]: I0127 15:09:04.662781 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:04 crc kubenswrapper[4772]: E0127 15:09:04.664208 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:09:04 crc kubenswrapper[4772]: I0127 15:09:04.664367 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:04 crc kubenswrapper[4772]: E0127 15:09:04.664549 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.662287 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.662391 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.665366 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.665554 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.665896 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 15:09:05 crc kubenswrapper[4772]: I0127 15:09:05.668029 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 15:09:06 crc kubenswrapper[4772]: I0127 15:09:06.662886 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:06 crc kubenswrapper[4772]: I0127 15:09:06.663010 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:06 crc kubenswrapper[4772]: I0127 15:09:06.665317 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 15:09:06 crc kubenswrapper[4772]: I0127 15:09:06.665502 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.823085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.881790 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.882505 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.882813 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.883564 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.885435 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2h2z8"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.886035 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.887758 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mfh29"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.888402 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.889140 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.889671 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.889695 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.890334 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.899110 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.899990 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.901498 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.903256 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 15:09:13 crc 
kubenswrapper[4772]: I0127 15:09:13.903648 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.904013 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.905718 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913067 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913387 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913631 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913773 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913914 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.913937 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914080 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914160 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 
15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914342 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914433 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914507 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.914555 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.916067 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.916517 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.916838 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bck4j"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.917315 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.917546 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.917825 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.917917 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.918224 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.918481 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.919340 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921022 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921190 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921289 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921317 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921376 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921379 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921553 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921574 
4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921624 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921713 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.922195 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-npths"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.922475 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921317 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.922776 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.921744 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.923726 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.924231 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.924619 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.925525 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926109 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926254 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926397 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926506 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926642 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926798 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.926842 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927012 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927107 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927212 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927304 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927365 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927524 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927692 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.927728 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tgmck"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.928104 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.928434 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.928646 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-vswtw"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.928943 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.929993 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.945869 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.945919 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mfh29"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.945929 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.946850 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.947444 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.947467 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.947586 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.947752 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.947920 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948031 4772 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948059 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948131 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948206 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948293 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948392 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948436 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948299 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948398 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948502 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948670 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 15:09:13 crc 
kubenswrapper[4772]: I0127 15:09:13.948741 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.948766 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950469 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-encryption-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950508 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-policies\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950532 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-node-pullsecrets\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950556 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-images\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950598 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-audit-dir\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-client\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkjwx\" (UniqueName: \"kubernetes.io/projected/d1aba7eb-5916-4023-90f2-10152ad89b63-kube-api-access-pkjwx\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950704 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950723 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-serving-cert\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950746 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:13 crc kubenswrapper[4772]: 
I0127 15:09:13.950768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-audit\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-encryption-config\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950850 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl5hf\" (UniqueName: \"kubernetes.io/projected/34e7a553-e424-472e-a143-76e7e08e57aa-kube-api-access-tl5hf\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950874 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24gk\" (UniqueName: \"kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: 
\"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950905 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-serving-cert\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950924 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-dir\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950947 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950972 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-client\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.950998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951019 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29qn6\" (UniqueName: \"kubernetes.io/projected/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-kube-api-access-29qn6\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c7gp\" (UniqueName: \"kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 
27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-config\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951139 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-image-import-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951290 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.951312 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config\") pod 
\"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.952816 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.956417 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.957726 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-npths"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.958114 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2h2z8"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.959039 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.974863 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.989312 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fgw98"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.989899 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.989881 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.989907 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.991915 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992097 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992311 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992347 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992409 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992460 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.992642 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bck4j"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.995773 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.995760 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jplbk"] Jan 27 15:09:13 crc kubenswrapper[4772]: I0127 15:09:13.998935 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 
15:09:14.000463 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.000699 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.000869 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.003551 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.004417 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.004823 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.004911 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005076 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005051 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005085 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005345 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005452 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005514 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.005632 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.007343 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.007364 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhgv8"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.008293 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vswtw"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.008394 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.011942 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7k7sg"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.012652 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.012734 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.014139 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.014896 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.021537 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.021759 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.025006 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.025709 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pwmhd"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.026142 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.026452 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.027085 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.027290 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.028006 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.028712 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.035828 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.036254 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.036364 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.037707 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.051983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-serving-cert\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052024 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052054 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-dir\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052075 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-client\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052100 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052126 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29qn6\" (UniqueName: \"kubernetes.io/projected/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-kube-api-access-29qn6\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052148 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c7gp\" (UniqueName: \"kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052215 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " 
pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052250 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-config\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052273 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052293 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-image-import-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: 
\"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-config\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052399 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-encryption-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052424 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-service-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-policies\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052471 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-node-pullsecrets\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052493 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qztp\" (UniqueName: \"kubernetes.io/projected/39dd090e-b988-4c36-88f0-c0cb28a23e8b-kube-api-access-5qztp\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052518 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052540 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-images\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052560 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052581 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-audit-dir\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052604 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-client\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkjwx\" (UniqueName: \"kubernetes.io/projected/d1aba7eb-5916-4023-90f2-10152ad89b63-kube-api-access-pkjwx\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 
15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-serving-cert\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052740 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dd090e-b988-4c36-88f0-c0cb28a23e8b-serving-cert\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052764 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052786 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-audit\") pod \"apiserver-76f77b778f-2h2z8\" (UID: 
\"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052824 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-encryption-config\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052867 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl5hf\" (UniqueName: \"kubernetes.io/projected/34e7a553-e424-472e-a143-76e7e08e57aa-kube-api-access-tl5hf\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.052889 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24gk\" (UniqueName: \"kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.053489 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-policies\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.054208 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-node-pullsecrets\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.063507 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-serving-cert\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.064496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.065053 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-audit\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.065470 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-trusted-ca-bundle\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.065700 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d1aba7eb-5916-4023-90f2-10152ad89b63-audit-dir\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.066674 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.066800 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-client\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.067069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d1aba7eb-5916-4023-90f2-10152ad89b63-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.067136 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-serving-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.067637 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.067710 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tgmck"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.067750 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.068071 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.081329 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-image-import-ca\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.086341 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/d1aba7eb-5916-4023-90f2-10152ad89b63-encryption-config\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.086765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e7a553-e424-472e-a143-76e7e08e57aa-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.087545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-images\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.090956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-config\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.091124 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-etcd-client\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.095031 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.095276 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34e7a553-e424-472e-a143-76e7e08e57aa-audit-dir\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.095663 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jplbk"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.095697 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.095857 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.096594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.098220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.098713 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-serving-cert\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.100242 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/34e7a553-e424-472e-a143-76e7e08e57aa-encryption-config\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.100278 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.100620 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.102014 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.112336 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.112537 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.112917 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.113075 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.117334 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.117361 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhgv8"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.117371 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.117449 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.121917 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.123190 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.123213 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fgw98"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.124368 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.124529 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djmb4"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.124970 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.125024 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4lj2h"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.125623 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.126191 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.126769 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.127299 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fw6bh"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.127667 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.128772 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.131441 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.132565 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.134515 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.135063 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.139420 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.141392 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.143086 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.144643 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.148884 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.149494 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.150884 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wv77"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.151501 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.152267 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-24dmv"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153263 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153443 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153493 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dd090e-b988-4c36-88f0-c0cb28a23e8b-serving-cert\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-config\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153594 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-service-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.153615 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qztp\" (UniqueName: \"kubernetes.io/projected/39dd090e-b988-4c36-88f0-c0cb28a23e8b-kube-api-access-5qztp\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.154512 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.154576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-service-ca-bundle\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.154661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39dd090e-b988-4c36-88f0-c0cb28a23e8b-config\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" 
Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.156949 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.157446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39dd090e-b988-4c36-88f0-c0cb28a23e8b-serving-cert\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.158327 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.159071 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.160467 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.161146 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.161867 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.162548 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.167344 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.167787 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.169196 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.171583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.173709 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pwmhd"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.174816 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djmb4"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.175474 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.178967 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-24dmv"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.180955 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.184154 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.187285 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.189498 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wv77"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.192854 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4lj2h"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.194399 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.195942 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.197067 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.198314 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-58dhr"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.205094 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.205236 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.208125 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.208430 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.211811 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fw6bh"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.213113 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.214304 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.215637 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.216798 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.217782 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hbbxh"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.218837 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.218974 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hbbxh"] Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.224816 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.244584 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.264885 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.286489 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.307594 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.325187 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.346441 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.365673 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.386535 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.405211 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.425878 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.445200 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.485806 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.505810 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.527783 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.546161 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.567092 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.585673 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.607813 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.626750 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.645673 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.665841 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.685976 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.704784 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.745310 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.765041 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.784857 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.806020 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.825332 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.868469 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m24gk\" (UniqueName: \"kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk\") pod \"route-controller-manager-6576b87f9c-r9glz\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.928839 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.937659 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkjwx\" (UniqueName: \"kubernetes.io/projected/d1aba7eb-5916-4023-90f2-10152ad89b63-kube-api-access-pkjwx\") pod \"apiserver-7bbb656c7d-wk7gd\" (UID: \"d1aba7eb-5916-4023-90f2-10152ad89b63\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.946076 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.946978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c7gp\" (UniqueName: \"kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp\") pod \"controller-manager-879f6c89f-6pclx\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.948184 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl5hf\" (UniqueName: \"kubernetes.io/projected/34e7a553-e424-472e-a143-76e7e08e57aa-kube-api-access-tl5hf\") pod \"apiserver-76f77b778f-2h2z8\" (UID: \"34e7a553-e424-472e-a143-76e7e08e57aa\") " pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:14 
crc kubenswrapper[4772]: I0127 15:09:14.948924 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29qn6\" (UniqueName: \"kubernetes.io/projected/625f7e2d-0e3f-4c2c-8f49-b09fc3638536-kube-api-access-29qn6\") pod \"machine-api-operator-5694c8668f-mfh29\" (UID: \"625f7e2d-0e3f-4c2c-8f49-b09fc3638536\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.965619 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 15:09:14 crc kubenswrapper[4772]: I0127 15:09:14.985911 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.005781 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.025674 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.045614 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.064900 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.085396 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.102202 4772 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.105053 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.126020 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.130489 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.143661 4772 request.go:700] Waited for 1.018420389s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca-operator/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.145888 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.151087 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.155038 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.165814 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.196215 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.200577 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.205643 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.228578 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.249925 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.275045 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.287550 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.293704 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.304870 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 15:09:15 crc 
kubenswrapper[4772]: I0127 15:09:15.326131 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.333580 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-2h2z8"] Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.345378 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.364643 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.380018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd"] Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.386608 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.406266 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.415838 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mfh29"] Jan 27 15:09:15 crc kubenswrapper[4772]: W0127 15:09:15.422829 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod625f7e2d_0e3f_4c2c_8f49_b09fc3638536.slice/crio-810797dbb78d0461ed7aa6192665d20e18e1d6246178b7721049b2d5b19a90b6 WatchSource:0}: Error finding container 810797dbb78d0461ed7aa6192665d20e18e1d6246178b7721049b2d5b19a90b6: Status 404 returned error can't find the container with id 810797dbb78d0461ed7aa6192665d20e18e1d6246178b7721049b2d5b19a90b6 Jan 27 15:09:15 crc 
kubenswrapper[4772]: I0127 15:09:15.424444 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.445666 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.465482 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.485296 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.505561 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.526235 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.544779 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.565085 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.585476 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.588274 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" event={"ID":"625f7e2d-0e3f-4c2c-8f49-b09fc3638536","Type":"ContainerStarted","Data":"b5d5b46006fddbc4deae6b1d3905bab70e2b1c8b23766630fb99d618a421958a"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.588352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" event={"ID":"625f7e2d-0e3f-4c2c-8f49-b09fc3638536","Type":"ContainerStarted","Data":"810797dbb78d0461ed7aa6192665d20e18e1d6246178b7721049b2d5b19a90b6"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.589295 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" event={"ID":"3dfd9a91-e760-4c80-96e6-ca6525aa86b8","Type":"ContainerStarted","Data":"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.589336 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" event={"ID":"3dfd9a91-e760-4c80-96e6-ca6525aa86b8","Type":"ContainerStarted","Data":"380eb12e1295d0270de3d27b76c2692262e194cd8678145268c2765050e2b23e"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.589631 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.591973 4772 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6pclx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.592020 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" 
podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.593801 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" event={"ID":"34e7a553-e424-472e-a143-76e7e08e57aa","Type":"ContainerStarted","Data":"10dc65c23141e7590f682165f4eacee477055ad9f3dde4720a2c88ce24d8486f"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.596424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" event={"ID":"d1aba7eb-5916-4023-90f2-10152ad89b63","Type":"ContainerStarted","Data":"07119c87526ca057f6f02dc0420297e1a13ea1238c1309af265353e4d0396523"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.598560 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" event={"ID":"8d519648-7eaa-49bb-9a09-bd91d09d98c0","Type":"ContainerStarted","Data":"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.598611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" event={"ID":"8d519648-7eaa-49bb-9a09-bd91d09d98c0","Type":"ContainerStarted","Data":"326c33fb529962a602f8dfd5dbe7dcbd0ebb132fe4709244f27812007b261a68"} Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.598956 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.601052 4772 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-r9glz container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.601096 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.605331 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.625243 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.647498 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.665653 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.685324 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.704946 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.724997 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.747132 4772 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.764980 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.786436 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.824752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qztp\" (UniqueName: \"kubernetes.io/projected/39dd090e-b988-4c36-88f0-c0cb28a23e8b-kube-api-access-5qztp\") pod \"authentication-operator-69f744f599-tgmck\" (UID: \"39dd090e-b988-4c36-88f0-c0cb28a23e8b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.825747 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.845841 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.866226 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.885651 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.905257 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.925419 4772 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.946606 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.956621 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.965515 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 15:09:15 crc kubenswrapper[4772]: I0127 15:09:15.986953 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.006091 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.027434 4772 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.050753 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.065242 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.168261 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-tgmck"] Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.172995 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173072 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173158 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173228 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd05c164-83f7-4ebe-bbe8-9db6707741c5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: 
\"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173254 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173300 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173335 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173384 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173405 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/192d0f8f-10f9-43e2-a24a-2019aae0db44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4225ddc-bdcd-4158-811b-113234d0c3d0-service-ca-bundle\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173470 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-metrics-certs\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192d0f8f-10f9-43e2-a24a-2019aae0db44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173534 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error\") pod 
\"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z5pj\" (UniqueName: \"kubernetes.io/projected/c4225ddc-bdcd-4158-811b-113234d0c3d0-kube-api-access-5z5pj\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.173880 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ee07e18c-9f40-41c3-b2fb-05fd325976e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.174021 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.673950135 +0000 UTC m=+142.654559343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.177535 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswpx\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-kube-api-access-mswpx\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.177606 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3150273d-63f7-4908-bcc5-2403e123d1e7-config-volume\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.177927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg5jv\" (UniqueName: \"kubernetes.io/projected/3150273d-63f7-4908-bcc5-2403e123d1e7-kube-api-access-dg5jv\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j667k\" (UniqueName: 
\"kubernetes.io/projected/5276546c-f731-4bd0-bb93-b5cd19b0992c-kube-api-access-j667k\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178088 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178705 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmgkc\" (UniqueName: \"kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178767 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9eef7e-3996-45b6-ab7b-50d319dc1117-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178891 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.178949 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-stats-auth\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.179006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2zfh\" (UniqueName: \"kubernetes.io/projected/ce4b4e2e-496b-4334-8736-db4f25473731-kube-api-access-k2zfh\") pod \"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-trusted-ca\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180469 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180861 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180890 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8800c113-7b51-4554-8e52-c1d0df1a08be-machine-approver-tls\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180917 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.180976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181059 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfglr\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-kube-api-access-cfglr\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181090 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181145 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee07e18c-9f40-41c3-b2fb-05fd325976e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181213 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce4b4e2e-496b-4334-8736-db4f25473731-metrics-tls\") pod 
\"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181259 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5276546c-f731-4bd0-bb93-b5cd19b0992c-serving-cert\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181288 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmzq\" (UniqueName: \"kubernetes.io/projected/17bdd07d-f7e5-47f8-b730-724d5cc8e3d2-kube-api-access-cwmzq\") pod \"downloads-7954f5f757-vswtw\" (UID: \"17bdd07d-f7e5-47f8-b730-724d5cc8e3d2\") " pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181317 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181406 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3150273d-63f7-4908-bcc5-2403e123d1e7-metrics-tls\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181432 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9eef7e-3996-45b6-ab7b-50d319dc1117-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s88q\" (UniqueName: \"kubernetes.io/projected/192d0f8f-10f9-43e2-a24a-2019aae0db44-kube-api-access-7s88q\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181515 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmpv\" (UniqueName: \"kubernetes.io/projected/ee07e18c-9f40-41c3-b2fb-05fd325976e4-kube-api-access-smmpv\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181560 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181590 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd05c164-83f7-4ebe-bbe8-9db6707741c5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181618 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181654 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h66t\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-auth-proxy-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06755863-1a8b-4f4d-a304-03bfd45725ec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181736 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181779 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-config\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181849 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181877 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbs2\" (UniqueName: \"kubernetes.io/projected/8800c113-7b51-4554-8e52-c1d0df1a08be-kube-api-access-4lbs2\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181905 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twqnj\" (UniqueName: \"kubernetes.io/projected/06755863-1a8b-4f4d-a304-03bfd45725ec-kube-api-access-twqnj\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181931 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd05c164-83f7-4ebe-bbe8-9db6707741c5-config\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.181956 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: 
\"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-default-certificate\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182122 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182325 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.182365 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.283452 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.283471 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.783450997 +0000 UTC m=+142.764060095 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.283991 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284019 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284041 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd05529-6d54-416a-8df0-5973ee3179b6-serving-cert\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284065 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfglr\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-kube-api-access-cfglr\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: 
\"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-plugins-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284098 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vjd\" (UniqueName: \"kubernetes.io/projected/55440e9e-5d99-4244-8b5c-55e2d270313b-kube-api-access-s8vjd\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284136 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284368 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-srv-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5276546c-f731-4bd0-bb93-b5cd19b0992c-serving-cert\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmzq\" (UniqueName: \"kubernetes.io/projected/17bdd07d-f7e5-47f8-b730-724d5cc8e3d2-kube-api-access-cwmzq\") pod \"downloads-7954f5f757-vswtw\" (UID: \"17bdd07d-f7e5-47f8-b730-724d5cc8e3d2\") " pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-serving-cert\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284516 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" 
Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284548 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284574 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3150273d-63f7-4908-bcc5-2403e123d1e7-metrics-tls\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284648 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmpv\" (UniqueName: \"kubernetes.io/projected/ee07e18c-9f40-41c3-b2fb-05fd325976e4-kube-api-access-smmpv\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284675 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85774248-1879-439e-9dd2-0d8661c299d6-proxy-tls\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: 
\"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd05c164-83f7-4ebe-bbe8-9db6707741c5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55440e9e-5d99-4244-8b5c-55e2d270313b-proxy-tls\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284782 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-auth-proxy-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284807 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06755863-1a8b-4f4d-a304-03bfd45725ec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284863 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f469e02c-9404-422a-bff0-1b945d9c8768-cert\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2scxc\" (UniqueName: \"kubernetes.io/projected/794cdbb9-3392-465a-8a0a-a78a465aee2b-kube-api-access-2scxc\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284930 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg5gf\" (UniqueName: \"kubernetes.io/projected/f469e02c-9404-422a-bff0-1b945d9c8768-kube-api-access-xg5gf\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " 
pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284968 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbkm\" (UniqueName: \"kubernetes.io/projected/4bab7259-25d5-4c53-9ebb-ef2787adf010-kube-api-access-vrbkm\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.284995 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285018 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-certs\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72xbw\" (UniqueName: \"kubernetes.io/projected/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-kube-api-access-72xbw\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285110 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-socket-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285157 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s25qm\" (UniqueName: \"kubernetes.io/projected/59cdf584-81d0-4d66-8fc2-da3a3f995f73-kube-api-access-s25qm\") pod \"migrator-59844c95c7-xqtff\" (UID: \"59cdf584-81d0-4d66-8fc2-da3a3f995f73\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285201 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe33359-31f5-4a6c-93fc-6502d2516335-config\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285272 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285303 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285325 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe33359-31f5-4a6c-93fc-6502d2516335-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285349 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmnrs\" (UniqueName: \"kubernetes.io/projected/dbd05529-6d54-416a-8df0-5973ee3179b6-kube-api-access-kmnrs\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285398 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-node-bootstrap-token\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc 
kubenswrapper[4772]: I0127 15:09:16.285441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01d08621-494b-4232-b678-9caa94e61085-tmpfs\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285502 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4225ddc-bdcd-4158-811b-113234d0c3d0-service-ca-bundle\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285553 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79859b30-67ee-456b-82e5-f8806347a0b9-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285583 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z5pj\" (UniqueName: \"kubernetes.io/projected/c4225ddc-bdcd-4158-811b-113234d0c3d0-kube-api-access-5z5pj\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ee07e18c-9f40-41c3-b2fb-05fd325976e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285646 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg5jv\" (UniqueName: \"kubernetes.io/projected/3150273d-63f7-4908-bcc5-2403e123d1e7-kube-api-access-dg5jv\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j667k\" (UniqueName: \"kubernetes.io/projected/5276546c-f731-4bd0-bb93-b5cd19b0992c-kube-api-access-j667k\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285695 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285719 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6t5\" (UniqueName: \"kubernetes.io/projected/79859b30-67ee-456b-82e5-f8806347a0b9-kube-api-access-9r6t5\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285750 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmgkc\" (UniqueName: \"kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcp9p\" (UniqueName: \"kubernetes.io/projected/85774248-1879-439e-9dd2-0d8661c299d6-kube-api-access-gcp9p\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates\") pod 
\"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-stats-auth\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285850 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2zfh\" (UniqueName: \"kubernetes.io/projected/ce4b4e2e-496b-4334-8736-db4f25473731-kube-api-access-k2zfh\") pod \"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9eef7e-3996-45b6-ab7b-50d319dc1117-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llq59\" (UniqueName: \"kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285924 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-mountpoint-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285946 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946946b3-f9e0-45e4-803f-edb3f7218489-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.285976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-csi-data-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db4a3858-5afa-44c8-a435-2010f7e7340d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286050 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvbbk\" (UniqueName: \"kubernetes.io/projected/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-kube-api-access-xvbbk\") pod \"catalog-operator-68c6474976-mnltb\" (UID: 
\"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286072 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286110 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286135 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-images\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286160 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8800c113-7b51-4554-8e52-c1d0df1a08be-machine-approver-tls\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286202 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286228 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-srv-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-service-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286272 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88tz\" (UniqueName: \"kubernetes.io/projected/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-kube-api-access-w88tz\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286295 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946946b3-f9e0-45e4-803f-edb3f7218489-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286320 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee07e18c-9f40-41c3-b2fb-05fd325976e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffe33359-31f5-4a6c-93fc-6502d2516335-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286365 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286409 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce4b4e2e-496b-4334-8736-db4f25473731-metrics-tls\") pod \"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286438 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7s88q\" (UniqueName: \"kubernetes.io/projected/192d0f8f-10f9-43e2-a24a-2019aae0db44-kube-api-access-7s88q\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286462 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/2c9eef7e-3996-45b6-ab7b-50d319dc1117-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286486 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-apiservice-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286488 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-trusted-ca\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h66t\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286535 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286559 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjbpj\" (UniqueName: \"kubernetes.io/projected/01d08621-494b-4232-b678-9caa94e61085-kube-api-access-mjbpj\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-config\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286612 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbs2\" (UniqueName: 
\"kubernetes.io/projected/8800c113-7b51-4554-8e52-c1d0df1a08be-kube-api-access-4lbs2\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twqnj\" (UniqueName: \"kubernetes.io/projected/06755863-1a8b-4f4d-a304-03bfd45725ec-kube-api-access-twqnj\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286705 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd05c164-83f7-4ebe-bbe8-9db6707741c5-config\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286729 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-client\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 
15:09:16.286731 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286754 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-default-certificate\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286783 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286809 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-config\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286833 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3745a211-9fa8-41a7-aa26-d733431bc9aa-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-24dmv\" (UID: 
\"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286859 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286905 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.286976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvb9h\" (UniqueName: \"kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287006 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.287053 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.78704049 +0000 UTC m=+142.767649588 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287081 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85774248-1879-439e-9dd2-0d8661c299d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287122 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzg8\" (UniqueName: \"kubernetes.io/projected/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-kube-api-access-jvzg8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287140 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxscw\" (UniqueName: \"kubernetes.io/projected/3745a211-9fa8-41a7-aa26-d733431bc9aa-kube-api-access-sxscw\") pod \"multus-admission-controller-857f4d67dd-24dmv\" (UID: \"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287193 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287230 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd05c164-83f7-4ebe-bbe8-9db6707741c5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: 
\"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287248 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5cbs\" (UniqueName: \"kubernetes.io/projected/855486b0-11f8-4ff0-930d-75c7e9d790d3-kube-api-access-m5cbs\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287267 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-registration-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287284 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6df\" (UniqueName: \"kubernetes.io/projected/db4a3858-5afa-44c8-a435-2010f7e7340d-kube-api-access-9v6df\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287303 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287326 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/192d0f8f-10f9-43e2-a24a-2019aae0db44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287350 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-metrics-certs\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192d0f8f-10f9-43e2-a24a-2019aae0db44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287398 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79859b30-67ee-456b-82e5-f8806347a0b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287423 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287478 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswpx\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-kube-api-access-mswpx\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287503 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/946946b3-f9e0-45e4-803f-edb3f7218489-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3150273d-63f7-4908-bcc5-2403e123d1e7-config-volume\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-key\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287590 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd05529-6d54-416a-8df0-5973ee3179b6-config\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287618 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqp4\" (UniqueName: \"kubernetes.io/projected/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-kube-api-access-ttqp4\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-webhook-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287667 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-trusted-ca\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.287723 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-cabundle\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.288574 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4225ddc-bdcd-4158-811b-113234d0c3d0-service-ca-bundle\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.289503 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.289787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/06755863-1a8b-4f4d-a304-03bfd45725ec-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.290129 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-auth-proxy-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.290269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/ee07e18c-9f40-41c3-b2fb-05fd325976e4-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.290628 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.291437 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.291935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.292047 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3150273d-63f7-4908-bcc5-2403e123d1e7-config-volume\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.292641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.293353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.293903 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.294054 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/192d0f8f-10f9-43e2-a24a-2019aae0db44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.294281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3150273d-63f7-4908-bcc5-2403e123d1e7-metrics-tls\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.295181 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee07e18c-9f40-41c3-b2fb-05fd325976e4-serving-cert\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.297904 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.299274 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.299784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.300237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.300317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.301089 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd05c164-83f7-4ebe-bbe8-9db6707741c5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.301143 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c9eef7e-3996-45b6-ab7b-50d319dc1117-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.301793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-metrics-certs\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.301939 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.302385 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-stats-auth\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.302703 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192d0f8f-10f9-43e2-a24a-2019aae0db44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.303123 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.303389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.303414 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800c113-7b51-4554-8e52-c1d0df1a08be-config\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.304525 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd05c164-83f7-4ebe-bbe8-9db6707741c5-config\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.304763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-config\") pod 
\"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.305727 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5276546c-f731-4bd0-bb93-b5cd19b0992c-serving-cert\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.305741 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5276546c-f731-4bd0-bb93-b5cd19b0992c-trusted-ca\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.306792 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8800c113-7b51-4554-8e52-c1d0df1a08be-machine-approver-tls\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.307222 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c4225ddc-bdcd-4158-811b-113234d0c3d0-default-certificate\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.308256 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/2c9eef7e-3996-45b6-ab7b-50d319dc1117-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.308492 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce4b4e2e-496b-4334-8736-db4f25473731-metrics-tls\") pod \"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.308582 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-metrics-tls\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.309019 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.309187 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.309345 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.310457 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.310694 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.314426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.321778 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc 
kubenswrapper[4772]: I0127 15:09:16.323949 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfglr\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-kube-api-access-cfglr\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.342062 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd05c164-83f7-4ebe-bbe8-9db6707741c5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mzkn2\" (UID: \"dd05c164-83f7-4ebe-bbe8-9db6707741c5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.360533 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.366368 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmzq\" (UniqueName: \"kubernetes.io/projected/17bdd07d-f7e5-47f8-b730-724d5cc8e3d2-kube-api-access-cwmzq\") pod \"downloads-7954f5f757-vswtw\" (UID: \"17bdd07d-f7e5-47f8-b730-724d5cc8e3d2\") " pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.379612 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmpv\" (UniqueName: \"kubernetes.io/projected/ee07e18c-9f40-41c3-b2fb-05fd325976e4-kube-api-access-smmpv\") pod \"openshift-config-operator-7777fb866f-bck4j\" (UID: \"ee07e18c-9f40-41c3-b2fb-05fd325976e4\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390316 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.390474 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.890450807 +0000 UTC m=+142.871059905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5cbs\" (UniqueName: \"kubernetes.io/projected/855486b0-11f8-4ff0-930d-75c7e9d790d3-kube-api-access-m5cbs\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-registration-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: 
I0127 15:09:16.390582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6df\" (UniqueName: \"kubernetes.io/projected/db4a3858-5afa-44c8-a435-2010f7e7340d-kube-api-access-9v6df\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxscw\" (UniqueName: \"kubernetes.io/projected/3745a211-9fa8-41a7-aa26-d733431bc9aa-kube-api-access-sxscw\") pod \"multus-admission-controller-857f4d67dd-24dmv\" (UID: \"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390630 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79859b30-67ee-456b-82e5-f8806347a0b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390652 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/946946b3-f9e0-45e4-803f-edb3f7218489-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-key\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390722 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390742 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd05529-6d54-416a-8df0-5973ee3179b6-config\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390773 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttqp4\" (UniqueName: \"kubernetes.io/projected/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-kube-api-access-ttqp4\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390793 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-webhook-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-cabundle\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390836 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd05529-6d54-416a-8df0-5973ee3179b6-serving-cert\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-plugins-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390875 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vjd\" (UniqueName: \"kubernetes.io/projected/55440e9e-5d99-4244-8b5c-55e2d270313b-kube-api-access-s8vjd\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390924 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-srv-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-serving-cert\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.390985 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85774248-1879-439e-9dd2-0d8661c299d6-proxy-tls\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391011 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f469e02c-9404-422a-bff0-1b945d9c8768-cert\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391031 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55440e9e-5d99-4244-8b5c-55e2d270313b-proxy-tls\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391053 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg5gf\" (UniqueName: \"kubernetes.io/projected/f469e02c-9404-422a-bff0-1b945d9c8768-kube-api-access-xg5gf\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391086 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2scxc\" (UniqueName: \"kubernetes.io/projected/794cdbb9-3392-465a-8a0a-a78a465aee2b-kube-api-access-2scxc\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbkm\" (UniqueName: \"kubernetes.io/projected/4bab7259-25d5-4c53-9ebb-ef2787adf010-kube-api-access-vrbkm\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391134 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-certs\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc 
kubenswrapper[4772]: I0127 15:09:16.391154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391196 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-socket-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391218 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s25qm\" (UniqueName: \"kubernetes.io/projected/59cdf584-81d0-4d66-8fc2-da3a3f995f73-kube-api-access-s25qm\") pod \"migrator-59844c95c7-xqtff\" (UID: \"59cdf584-81d0-4d66-8fc2-da3a3f995f73\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391240 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72xbw\" (UniqueName: \"kubernetes.io/projected/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-kube-api-access-72xbw\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391269 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe33359-31f5-4a6c-93fc-6502d2516335-config\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: 
\"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391291 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391314 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe33359-31f5-4a6c-93fc-6502d2516335-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391337 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmnrs\" (UniqueName: \"kubernetes.io/projected/dbd05529-6d54-416a-8df0-5973ee3179b6-kube-api-access-kmnrs\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391359 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-node-bootstrap-token\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391377 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01d08621-494b-4232-b678-9caa94e61085-tmpfs\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79859b30-67ee-456b-82e5-f8806347a0b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391477 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6t5\" (UniqueName: \"kubernetes.io/projected/79859b30-67ee-456b-82e5-f8806347a0b9-kube-api-access-9r6t5\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391512 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcp9p\" (UniqueName: \"kubernetes.io/projected/85774248-1879-439e-9dd2-0d8661c299d6-kube-api-access-gcp9p\") 
pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llq59\" (UniqueName: \"kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-mountpoint-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391589 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946946b3-f9e0-45e4-803f-edb3f7218489-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-csi-data-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391633 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db4a3858-5afa-44c8-a435-2010f7e7340d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvbbk\" (UniqueName: \"kubernetes.io/projected/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-kube-api-access-xvbbk\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391679 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-images\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-srv-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391754 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-service-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391775 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88tz\" (UniqueName: \"kubernetes.io/projected/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-kube-api-access-w88tz\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391794 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946946b3-f9e0-45e4-803f-edb3f7218489-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffe33359-31f5-4a6c-93fc-6502d2516335-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391838 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391866 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-apiservice-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391897 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjbpj\" (UniqueName: \"kubernetes.io/projected/01d08621-494b-4232-b678-9caa94e61085-kube-api-access-mjbpj\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391940 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-config\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-client\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.391983 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3745a211-9fa8-41a7-aa26-d733431bc9aa-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-24dmv\" (UID: \"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392010 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392032 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvb9h\" (UniqueName: \"kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392066 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392088 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85774248-1879-439e-9dd2-0d8661c299d6-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392112 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392132 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392157 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzg8\" (UniqueName: \"kubernetes.io/projected/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-kube-api-access-jvzg8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.392343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd05529-6d54-416a-8df0-5973ee3179b6-config\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 
15:09:16.392682 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-registration-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.393007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe33359-31f5-4a6c-93fc-6502d2516335-config\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.393322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79859b30-67ee-456b-82e5-f8806347a0b9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.393625 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.393964 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-mountpoint-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: 
\"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.394094 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-csi-data-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.394421 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-certs\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.395041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.395112 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-socket-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.396519 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f469e02c-9404-422a-bff0-1b945d9c8768-cert\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " 
pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.396658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/85774248-1879-439e-9dd2-0d8661c299d6-proxy-tls\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.397430 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/db4a3858-5afa-44c8-a435-2010f7e7340d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.397548 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55440e9e-5d99-4244-8b5c-55e2d270313b-proxy-tls\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.397893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-cabundle\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.397976 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.897962754 +0000 UTC m=+142.878571852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.398587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/01d08621-494b-4232-b678-9caa94e61085-tmpfs\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399132 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399162 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v\") pod \"oauth-openshift-558db77b4-fgw98\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/794cdbb9-3392-465a-8a0a-a78a465aee2b-plugins-dir\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399405 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/855486b0-11f8-4ff0-930d-75c7e9d790d3-signing-key\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-apiservice-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/55440e9e-5d99-4244-8b5c-55e2d270313b-images\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.399830 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/85774248-1879-439e-9dd2-0d8661c299d6-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.400303 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/79859b30-67ee-456b-82e5-f8806347a0b9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.400961 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/01d08621-494b-4232-b678-9caa94e61085-webhook-cert\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.401299 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.401399 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-srv-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.401794 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.402567 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.402647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd05529-6d54-416a-8df0-5973ee3179b6-serving-cert\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.403055 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-client\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.403220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-service-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.403672 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-config\") pod \"etcd-operator-b45778765-pwmhd\" (UID: 
\"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.404030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/946946b3-f9e0-45e4-803f-edb3f7218489-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.404117 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-srv-cert\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.404325 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-etcd-ca\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.404894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-profile-collector-cert\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.405124 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3745a211-9fa8-41a7-aa26-d733431bc9aa-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-24dmv\" (UID: \"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.405361 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-serving-cert\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.405909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.406412 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.406458 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.407043 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe33359-31f5-4a6c-93fc-6502d2516335-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.408322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4bab7259-25d5-4c53-9ebb-ef2787adf010-node-bootstrap-token\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.408568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/946946b3-f9e0-45e4-803f-edb3f7218489-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.422512 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z5pj\" (UniqueName: \"kubernetes.io/projected/c4225ddc-bdcd-4158-811b-113234d0c3d0-kube-api-access-5z5pj\") pod \"router-default-5444994796-7k7sg\" (UID: \"c4225ddc-bdcd-4158-811b-113234d0c3d0\") " pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.438664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f9ab8468-b920-41e9-a5f2-1af70f1b5ffd-bound-sa-token\") pod \"ingress-operator-5b745b69d9-w4j4h\" (UID: \"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 
15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.444116 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.472262 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg5jv\" (UniqueName: \"kubernetes.io/projected/3150273d-63f7-4908-bcc5-2403e123d1e7-kube-api-access-dg5jv\") pod \"dns-default-jplbk\" (UID: \"3150273d-63f7-4908-bcc5-2403e123d1e7\") " pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.480366 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j667k\" (UniqueName: \"kubernetes.io/projected/5276546c-f731-4bd0-bb93-b5cd19b0992c-kube-api-access-j667k\") pod \"console-operator-58897d9998-npths\" (UID: \"5276546c-f731-4bd0-bb93-b5cd19b0992c\") " pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.492985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.493465 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:16.993450821 +0000 UTC m=+142.974059919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.500827 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswpx\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-kube-api-access-mswpx\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.518545 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.524154 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmgkc\" (UniqueName: \"kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc\") pod \"console-f9d7485db-7qfrl\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.531482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2"] Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.539550 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.543444 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2zfh\" (UniqueName: \"kubernetes.io/projected/ce4b4e2e-496b-4334-8736-db4f25473731-kube-api-access-k2zfh\") pod \"dns-operator-744455d44c-bhgv8\" (UID: \"ce4b4e2e-496b-4334-8736-db4f25473731\") " pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.561894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c9eef7e-3996-45b6-ab7b-50d319dc1117-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4v228\" (UID: \"2c9eef7e-3996-45b6-ab7b-50d319dc1117\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.576487 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.581106 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s88q\" (UniqueName: \"kubernetes.io/projected/192d0f8f-10f9-43e2-a24a-2019aae0db44-kube-api-access-7s88q\") pod \"openshift-apiserver-operator-796bbdcf4f-zq27x\" (UID: \"192d0f8f-10f9-43e2-a24a-2019aae0db44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.589660 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.594963 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.595376 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.095359944 +0000 UTC m=+143.075969052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.610763 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h66t\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.613493 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.619292 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.627199 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.630441 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.636014 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" event={"ID":"dd05c164-83f7-4ebe-bbe8-9db6707741c5","Type":"ContainerStarted","Data":"68665c93b188501aa2d7df5f5af86d404704d1403f6fe10a4093685c7a6ee1cf"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.638552 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.639580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" event={"ID":"625f7e2d-0e3f-4c2c-8f49-b09fc3638536","Type":"ContainerStarted","Data":"b28cd8c3fe65bb44ff5ed3705cd139aedd1d6522b0d06267a643bc732701558f"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.641480 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" event={"ID":"39dd090e-b988-4c36-88f0-c0cb28a23e8b","Type":"ContainerStarted","Data":"257d4dbd74cf09cb130ad2e955eeb1266c017333a47677c8b512b2f5edde9667"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.641505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" event={"ID":"39dd090e-b988-4c36-88f0-c0cb28a23e8b","Type":"ContainerStarted","Data":"0a778ca4bceec9fd16b9c8f9ec95ee3fbacfad7d71ea4a992fe5cb953899bd2d"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.647057 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.649340 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twqnj\" (UniqueName: \"kubernetes.io/projected/06755863-1a8b-4f4d-a304-03bfd45725ec-kube-api-access-twqnj\") pod \"cluster-samples-operator-665b6dd947-g667l\" (UID: \"06755863-1a8b-4f4d-a304-03bfd45725ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.654431 4772 generic.go:334] "Generic (PLEG): container finished" podID="34e7a553-e424-472e-a143-76e7e08e57aa" containerID="d15bf5cf293ed09673e38bdec1e04909a7c8924975f8f59b2ddf45cd1d0c265e" exitCode=0 Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.654579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" event={"ID":"34e7a553-e424-472e-a143-76e7e08e57aa","Type":"ContainerDied","Data":"d15bf5cf293ed09673e38bdec1e04909a7c8924975f8f59b2ddf45cd1d0c265e"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.664861 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbs2\" (UniqueName: \"kubernetes.io/projected/8800c113-7b51-4554-8e52-c1d0df1a08be-kube-api-access-4lbs2\") pod \"machine-approver-56656f9798-klfsg\" (UID: \"8800c113-7b51-4554-8e52-c1d0df1a08be\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.696401 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.697016 4772 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.196980169 +0000 UTC m=+143.177589267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.697256 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.697274 4772 generic.go:334] "Generic (PLEG): container finished" podID="d1aba7eb-5916-4023-90f2-10152ad89b63" containerID="5a300fb78fa1e933a3df435d7e3452e503aa76c100098936d48e100eae063049" exitCode=0 Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.697626 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.197614567 +0000 UTC m=+143.178223665 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.700348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" event={"ID":"d1aba7eb-5916-4023-90f2-10152ad89b63","Type":"ContainerDied","Data":"5a300fb78fa1e933a3df435d7e3452e503aa76c100098936d48e100eae063049"} Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.700381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bck4j"] Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.711732 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.713454 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.720025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72xbw\" (UniqueName: \"kubernetes.io/projected/b3eefbc3-6dc4-479c-93e4-94a70fda0f83-kube-api-access-72xbw\") pod \"package-server-manager-789f6589d5-cv2z7\" (UID: \"b3eefbc3-6dc4-479c-93e4-94a70fda0f83\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.737595 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vrbkm\" (UniqueName: \"kubernetes.io/projected/4bab7259-25d5-4c53-9ebb-ef2787adf010-kube-api-access-vrbkm\") pod \"machine-config-server-58dhr\" (UID: \"4bab7259-25d5-4c53-9ebb-ef2787adf010\") " pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.737845 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.746840 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzg8\" (UniqueName: \"kubernetes.io/projected/c66d8c7d-2de8-492f-ba5e-7ff0e236bf64-kube-api-access-jvzg8\") pod \"kube-storage-version-migrator-operator-b67b599dd-zkmjj\" (UID: \"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.752975 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.761627 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.763274 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5cbs\" (UniqueName: \"kubernetes.io/projected/855486b0-11f8-4ff0-930d-75c7e9d790d3-kube-api-access-m5cbs\") pod \"service-ca-9c57cc56f-9wv77\" (UID: \"855486b0-11f8-4ff0-930d-75c7e9d790d3\") " pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.776572 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.785482 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6df\" (UniqueName: \"kubernetes.io/projected/db4a3858-5afa-44c8-a435-2010f7e7340d-kube-api-access-9v6df\") pod \"control-plane-machine-set-operator-78cbb6b69f-5vmlj\" (UID: \"db4a3858-5afa-44c8-a435-2010f7e7340d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.797964 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.799154 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.299133809 +0000 UTC m=+143.279742907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.822099 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-58dhr" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.822752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6t5\" (UniqueName: \"kubernetes.io/projected/79859b30-67ee-456b-82e5-f8806347a0b9-kube-api-access-9r6t5\") pod \"openshift-controller-manager-operator-756b6f6bc6-c6n8m\" (UID: \"79859b30-67ee-456b-82e5-f8806347a0b9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.833730 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxscw\" (UniqueName: \"kubernetes.io/projected/3745a211-9fa8-41a7-aa26-d733431bc9aa-kube-api-access-sxscw\") pod \"multus-admission-controller-857f4d67dd-24dmv\" (UID: \"3745a211-9fa8-41a7-aa26-d733431bc9aa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.864574 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-npths"] Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.865559 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.892028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcp9p\" (UniqueName: \"kubernetes.io/projected/85774248-1879-439e-9dd2-0d8661c299d6-kube-api-access-gcp9p\") pod \"machine-config-controller-84d6567774-v7l4k\" (UID: \"85774248-1879-439e-9dd2-0d8661c299d6\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.897128 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llq59\" (UniqueName: \"kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59\") pod \"collect-profiles-29492100-r2zj6\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.899725 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/946946b3-f9e0-45e4-803f-edb3f7218489-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f9fqs\" (UID: \"946946b3-f9e0-45e4-803f-edb3f7218489\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.900300 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:16 crc kubenswrapper[4772]: E0127 15:09:16.900867 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.400853066 +0000 UTC m=+143.381462164 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.919206 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s25qm\" (UniqueName: \"kubernetes.io/projected/59cdf584-81d0-4d66-8fc2-da3a3f995f73-kube-api-access-s25qm\") pod \"migrator-59844c95c7-xqtff\" (UID: \"59cdf584-81d0-4d66-8fc2-da3a3f995f73\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.937248 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.954445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg5gf\" (UniqueName: \"kubernetes.io/projected/f469e02c-9404-422a-bff0-1b945d9c8768-kube-api-access-xg5gf\") pod \"ingress-canary-fw6bh\" (UID: \"f469e02c-9404-422a-bff0-1b945d9c8768\") " pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.968002 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.977621 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.981933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2scxc\" (UniqueName: \"kubernetes.io/projected/794cdbb9-3392-465a-8a0a-a78a465aee2b-kube-api-access-2scxc\") pod \"csi-hostpathplugin-hbbxh\" (UID: \"794cdbb9-3392-465a-8a0a-a78a465aee2b\") " pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.985496 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.988354 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vjd\" (UniqueName: \"kubernetes.io/projected/55440e9e-5d99-4244-8b5c-55e2d270313b-kube-api-access-s8vjd\") pod \"machine-config-operator-74547568cd-njx6w\" (UID: \"55440e9e-5d99-4244-8b5c-55e2d270313b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:16 crc kubenswrapper[4772]: I0127 15:09:16.988579 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttqp4\" (UniqueName: \"kubernetes.io/projected/01d2c3f9-778c-4cf0-b8a4-76583f62df3c-kube-api-access-ttqp4\") pod \"olm-operator-6b444d44fb-jdcpn\" (UID: \"01d2c3f9-778c-4cf0-b8a4-76583f62df3c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.001482 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.001853 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.501837623 +0000 UTC m=+143.482446731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.001996 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.010218 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvbbk\" (UniqueName: \"kubernetes.io/projected/1f03647c-a3e9-4099-9780-e79e3a4d4cf2-kube-api-access-xvbbk\") pod \"catalog-operator-68c6474976-mnltb\" (UID: \"1f03647c-a3e9-4099-9780-e79e3a4d4cf2\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.021376 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvb9h\" (UniqueName: \"kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h\") pod \"marketplace-operator-79b997595-4lj2h\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") " pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.027764 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.039522 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fw6bh" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.046270 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.060665 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.067148 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjbpj\" (UniqueName: \"kubernetes.io/projected/01d08621-494b-4232-b678-9caa94e61085-kube-api-access-mjbpj\") pod \"packageserver-d55dfcdfc-cf6v7\" (UID: \"01d08621-494b-4232-b678-9caa94e61085\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.068537 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.063113 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88tz\" (UniqueName: \"kubernetes.io/projected/c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3-kube-api-access-w88tz\") pod \"etcd-operator-b45778765-pwmhd\" (UID: \"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.089216 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.092295 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.097221 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffe33359-31f5-4a6c-93fc-6502d2516335-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fbwkz\" (UID: \"ffe33359-31f5-4a6c-93fc-6502d2516335\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.102807 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.104092 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.104644 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.604634351 +0000 UTC m=+143.585243449 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.114319 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.114671 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.115255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmnrs\" (UniqueName: \"kubernetes.io/projected/dbd05529-6d54-416a-8df0-5973ee3179b6-kube-api-access-kmnrs\") pod \"service-ca-operator-777779d784-djmb4\" (UID: \"dbd05529-6d54-416a-8df0-5973ee3179b6\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.144378 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.191018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-vswtw"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.204758 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.205071 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.705056981 +0000 UTC m=+143.685666079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.267403 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.296329 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.310937 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.311846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.312207 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.812192595 +0000 UTC m=+143.792801693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.317644 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.387108 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bhgv8"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.411136 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fgw98"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.413203 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228"] Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.413854 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.91383787 +0000 UTC m=+143.894446968 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.413684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.414340 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.414620 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:17.914613233 +0000 UTC m=+143.895222331 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.435900 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.476859 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jplbk"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.497365 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.517001 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.517564 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.017544655 +0000 UTC m=+143.998153753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.629800 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.631822 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.131791204 +0000 UTC m=+144.112400292 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.632285 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9wv77"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.633288 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l"] Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.711677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" event={"ID":"ce4b4e2e-496b-4334-8736-db4f25473731","Type":"ContainerStarted","Data":"591f6838813fcb88aeef0cd6093d0e0bcd3252c2d5c3d310e4d6567aed6039c2"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.714051 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" event={"ID":"06cdc094-b372-4016-bc5e-4c15a28e032e","Type":"ContainerStarted","Data":"3982e848b6f4ab4d0d2958e425dd1a480bb7b8b136363856076bf9ce68e097fb"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.722384 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-58dhr" event={"ID":"4bab7259-25d5-4c53-9ebb-ef2787adf010","Type":"ContainerStarted","Data":"e98f41d2510b33d0698dbd7ae4d039343cceadb5ae237cf925b7e7a8d5435dc6"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.737312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.738874 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7k7sg" event={"ID":"c4225ddc-bdcd-4158-811b-113234d0c3d0","Type":"ContainerStarted","Data":"25cd0105e9828132acd6b8f37dcf02f4c7acf6955ff8d08ef76b9c80aa284d6e"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.738923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7k7sg" event={"ID":"c4225ddc-bdcd-4158-811b-113234d0c3d0","Type":"ContainerStarted","Data":"ea4583bad816e9577db96aec74ce06766b98235a7e5195bb6afc2a37cf197924"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.744333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" event={"ID":"2c9eef7e-3996-45b6-ab7b-50d319dc1117","Type":"ContainerStarted","Data":"0771b389042915c69da1c41aa8f21b3569cbce61ac4e69d6456d7e18da8f069f"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.745377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-npths" event={"ID":"5276546c-f731-4bd0-bb93-b5cd19b0992c","Type":"ContainerStarted","Data":"2bbe0cbce3f8ea6011ebf6dcbfd30782c40d8ce69552ccff0de447ef0b35edb8"} Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.745563 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.245533799 +0000 UTC m=+144.226142907 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.746055 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vswtw" event={"ID":"17bdd07d-f7e5-47f8-b730-724d5cc8e3d2","Type":"ContainerStarted","Data":"01a9119414207291aef3e1db49c29878d281b57cf66ccb2596ca13d4b5131174"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.748009 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7qfrl" event={"ID":"e2e31e5f-3a41-42f5-90b0-99c05a8033a6","Type":"ContainerStarted","Data":"689105dc82b6dcc122fad60678c44aee714f4e2b250e67f0c76903dd34d0b5c3"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.748700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" event={"ID":"8800c113-7b51-4554-8e52-c1d0df1a08be","Type":"ContainerStarted","Data":"5da0fe9c33333bb873c2a165867fd7f8badf46f60b030c054559d9f8586f985a"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.749674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" event={"ID":"192d0f8f-10f9-43e2-a24a-2019aae0db44","Type":"ContainerStarted","Data":"f67ee9130f52502f0ec03d9b0c5b1493ac32c1219f9e65c73c2fc6b7e0fc3336"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.768235 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" 
event={"ID":"34e7a553-e424-472e-a143-76e7e08e57aa","Type":"ContainerStarted","Data":"2cae3b4aa67dc5e6a63a0bb18d2ab08e5b6efd969347371e8159f04131f1c67c"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.780203 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" event={"ID":"dd05c164-83f7-4ebe-bbe8-9db6707741c5","Type":"ContainerStarted","Data":"0093a9aed6c8a4572b3b2643fc6ead21c9a27ea9f74c29c40bfc2795ebe6034e"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.795305 4772 generic.go:334] "Generic (PLEG): container finished" podID="ee07e18c-9f40-41c3-b2fb-05fd325976e4" containerID="b7ce8aeca0ab242b2614d3530a13b679f4b6cbfc2d58f53b6ac82ac0eae6ad31" exitCode=0 Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.798375 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" event={"ID":"ee07e18c-9f40-41c3-b2fb-05fd325976e4","Type":"ContainerDied","Data":"b7ce8aeca0ab242b2614d3530a13b679f4b6cbfc2d58f53b6ac82ac0eae6ad31"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.798410 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" event={"ID":"ee07e18c-9f40-41c3-b2fb-05fd325976e4","Type":"ContainerStarted","Data":"401d72ccb1e730289431212d99b14989550a7ab785fccf9bb2d9a54765a06a2d"} Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.842945 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.843330 4772 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.343314463 +0000 UTC m=+144.323923561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.924485 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mfh29" podStartSLOduration=120.924469446 podStartE2EDuration="2m0.924469446s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:17.924299351 +0000 UTC m=+143.904908449" watchObservedRunningTime="2026-01-27 15:09:17.924469446 +0000 UTC m=+143.905078544" Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.946013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.946288 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:18.446231425 +0000 UTC m=+144.426840533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:17 crc kubenswrapper[4772]: I0127 15:09:17.946922 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:17 crc kubenswrapper[4772]: E0127 15:09:17.953711 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.45369719 +0000 UTC m=+144.434306288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.047906 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.048289 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.548273981 +0000 UTC m=+144.528883079 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.148973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.149603 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.649589957 +0000 UTC m=+144.630199055 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.244027 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.247232 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.252977 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.253282 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.753264071 +0000 UTC m=+144.733873169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.306508 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.361449 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.362449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.363532 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.863520514 +0000 UTC m=+144.844129612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.433730 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-tgmck" podStartSLOduration=122.433709101 podStartE2EDuration="2m2.433709101s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:18.420047026 +0000 UTC m=+144.400656124" watchObservedRunningTime="2026-01-27 15:09:18.433709101 +0000 UTC m=+144.414318199" Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.465108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.465257 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.965235312 +0000 UTC m=+144.945844410 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.465417 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.465741 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:18.965726106 +0000 UTC m=+144.946335204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.505938 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" podStartSLOduration=121.505923997 podStartE2EDuration="2m1.505923997s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:18.505463453 +0000 UTC m=+144.486072551" watchObservedRunningTime="2026-01-27 15:09:18.505923997 +0000 UTC m=+144.486533095" Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.570702 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.571084 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.071068898 +0000 UTC m=+145.051677996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.615498 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" podStartSLOduration=122.615480881 podStartE2EDuration="2m2.615480881s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:18.613609246 +0000 UTC m=+144.594218364" watchObservedRunningTime="2026-01-27 15:09:18.615480881 +0000 UTC m=+144.596089979" Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.651215 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.671737 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.671798 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 
15:09:18.673149 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.673508 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.173494126 +0000 UTC m=+145.154103224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.782415 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.782872 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:19.282857724 +0000 UTC m=+145.263466832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.895484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:18 crc kubenswrapper[4772]: E0127 15:09:18.895925 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.395910739 +0000 UTC m=+145.376519837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.945909 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" event={"ID":"b3eefbc3-6dc4-479c-93e4-94a70fda0f83","Type":"ContainerStarted","Data":"6b86debb441021b003508513ead164bde4f10f62cc65635db9f5f4a1885e9223"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946336 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" event={"ID":"85774248-1879-439e-9dd2-0d8661c299d6","Type":"ContainerStarted","Data":"97aa3ed61e762c362dda1f72b62105466808bca994d182bf11015ff53163c53a"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" event={"ID":"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64","Type":"ContainerStarted","Data":"ef588de4253172ca4c9ff7d80a68abed8f0bb2c9628e6b73579aaf45a29f1da7"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" 
event={"ID":"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd","Type":"ContainerStarted","Data":"e585c3d7dc35ef0921dfcf5d7a69da5c2136ec4e226763f0534e80b6e7fc26a1"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946385 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fw6bh"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946396 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4lj2h"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946409 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-58dhr" event={"ID":"4bab7259-25d5-4c53-9ebb-ef2787adf010","Type":"ContainerStarted","Data":"6686f6e3b763d8ff3782aa8904c0077068565aa5e1027885a9ec0f54ceffe277"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946422 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" event={"ID":"855486b0-11f8-4ff0-930d-75c7e9d790d3","Type":"ContainerStarted","Data":"2716f89b59875013e93f2fbf23b4377a3b3341f577d15221a25e0df7535fd5b6"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946434 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946445 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" event={"ID":"06755863-1a8b-4f4d-a304-03bfd45725ec","Type":"ContainerStarted","Data":"27109b40b600a3fae2adf6265056a955275795e1927236d4e8a26d80dedb3184"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946457 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7qfrl" 
event={"ID":"e2e31e5f-3a41-42f5-90b0-99c05a8033a6","Type":"ContainerStarted","Data":"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946469 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jplbk" event={"ID":"3150273d-63f7-4908-bcc5-2403e123d1e7","Type":"ContainerStarted","Data":"7f1ee6e3a83ddf43c8ad216851c662b4e9a3878b1d55de426bd25cf10e59ddb7"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.946483 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.952623 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn"] Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.964480 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" event={"ID":"79859b30-67ee-456b-82e5-f8806347a0b9","Type":"ContainerStarted","Data":"e6839e71f42bf9813140ab9409d87011aad5df15e0b92baa26f0ae3dbd0708ac"} Jan 27 15:09:18 crc kubenswrapper[4772]: I0127 15:09:18.992710 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" event={"ID":"d1aba7eb-5916-4023-90f2-10152ad89b63","Type":"ContainerStarted","Data":"1cc65229ba685a5832f280c3c5752f78d182cfcc1da6b7cfa03ee16063a744c8"} Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.007246 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 
15:09:19.028712 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.528691963 +0000 UTC m=+145.509301061 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.029798 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pwmhd"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.054220 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hbbxh"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.082398 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-24dmv"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.098458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.104216 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-7qfrl" podStartSLOduration=123.104194734 podStartE2EDuration="2m3.104194734s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:19.073973011 
+0000 UTC m=+145.054582109" watchObservedRunningTime="2026-01-27 15:09:19.104194734 +0000 UTC m=+145.084803832" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.108485 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.108962 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.608942171 +0000 UTC m=+145.589551309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: W0127 15:09:19.119531 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4a3858_5afa_44c8_a435_2010f7e7340d.slice/crio-ae81c92d45d0547ff142b72b22b85da95795638755056827220d1a9427feb720 WatchSource:0}: Error finding container ae81c92d45d0547ff142b72b22b85da95795638755056827220d1a9427feb720: Status 404 returned error can't find the container with id ae81c92d45d0547ff142b72b22b85da95795638755056827220d1a9427feb720 Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.120773 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7k7sg" podStartSLOduration=123.120748842 podStartE2EDuration="2m3.120748842s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:19.106252673 +0000 UTC m=+145.086861781" watchObservedRunningTime="2026-01-27 15:09:19.120748842 +0000 UTC m=+145.101357940" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.159064 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" podStartSLOduration=122.159043188 podStartE2EDuration="2m2.159043188s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:19.134722875 +0000 UTC m=+145.115331993" watchObservedRunningTime="2026-01-27 15:09:19.159043188 +0000 UTC m=+145.139652296" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.161811 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.211567 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.211834 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:19.711807311 +0000 UTC m=+145.692416409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.212052 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.212502 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.712486071 +0000 UTC m=+145.693095179 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: W0127 15:09:19.226902 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6b54ae2_d365_4988_8e69_704574c7962a.slice/crio-7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c WatchSource:0}: Error finding container 7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c: Status 404 returned error can't find the container with id 7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.234138 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mzkn2" podStartSLOduration=123.234119656 podStartE2EDuration="2m3.234119656s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:19.23356676 +0000 UTC m=+145.214175878" watchObservedRunningTime="2026-01-27 15:09:19.234119656 +0000 UTC m=+145.214728764" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.234644 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-58dhr" podStartSLOduration=6.23463554 podStartE2EDuration="6.23463554s" podCreationTimestamp="2026-01-27 15:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:19.212144421 +0000 UTC m=+145.192753519" watchObservedRunningTime="2026-01-27 15:09:19.23463554 +0000 UTC m=+145.215244638" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.257482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.281048 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.292400 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-djmb4"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.294876 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7"] Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.317044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.317335 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.817322658 +0000 UTC m=+145.797931756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: W0127 15:09:19.322764 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55440e9e_5d99_4244_8b5c_55e2d270313b.slice/crio-f58ff164a8a285569068e112a435cf8823b51149c0f63ce1fea966019e44a1dc WatchSource:0}: Error finding container f58ff164a8a285569068e112a435cf8823b51149c0f63ce1fea966019e44a1dc: Status 404 returned error can't find the container with id f58ff164a8a285569068e112a435cf8823b51149c0f63ce1fea966019e44a1dc Jan 27 15:09:19 crc kubenswrapper[4772]: W0127 15:09:19.334898 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cdf584_81d0_4d66_8fc2_da3a3f995f73.slice/crio-9d1cdd41fa91ae41d42bf9383e73d200fdebf020e8f9e1969deb26535791e0cc WatchSource:0}: Error finding container 9d1cdd41fa91ae41d42bf9383e73d200fdebf020e8f9e1969deb26535791e0cc: Status 404 returned error can't find the container with id 9d1cdd41fa91ae41d42bf9383e73d200fdebf020e8f9e1969deb26535791e0cc Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.418268 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:19 crc 
kubenswrapper[4772]: E0127 15:09:19.418591 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:19.918579082 +0000 UTC m=+145.899188180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.519025 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.519417 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.019395974 +0000 UTC m=+146.000005072 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.621240 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.622047 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.122026067 +0000 UTC m=+146.102635365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.655217 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:19 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:19 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:19 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.655294 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.723484 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.724076 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:20.224055084 +0000 UTC m=+146.204664182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.825480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.825896 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.325878894 +0000 UTC m=+146.306487992 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:19 crc kubenswrapper[4772]: I0127 15:09:19.927001 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:19 crc kubenswrapper[4772]: E0127 15:09:19.927416 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.427398706 +0000 UTC m=+146.408007804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.000707 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" event={"ID":"794cdbb9-3392-465a-8a0a-a78a465aee2b","Type":"ContainerStarted","Data":"cf6cfdcc00e037a8fc2f5bf5dd6d157915e559e9d062c1cb653a624ab52284d0"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.002148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" event={"ID":"ce4b4e2e-496b-4334-8736-db4f25473731","Type":"ContainerStarted","Data":"60e71282f3e1dce71c46cf15179352066c7821472a93deaf7e3dc1987d0ad580"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.005602 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" event={"ID":"06cdc094-b372-4016-bc5e-4c15a28e032e","Type":"ContainerStarted","Data":"437c578755bfcacf0145c1b3dcede3b1938b4e11e6ad9c7db9d8ac6a8b6df37e"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.009798 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fw6bh" event={"ID":"f469e02c-9404-422a-bff0-1b945d9c8768","Type":"ContainerStarted","Data":"e8e2cc78c92118a9abbd5bc3282c48c2f6dffed936de924a7357cb4bf4e29f56"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.010139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fw6bh" 
event={"ID":"f469e02c-9404-422a-bff0-1b945d9c8768","Type":"ContainerStarted","Data":"e386fe16c45d765b84619d4118b927f00e6e3264118ad353551e3583d8ea5056"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.013839 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" event={"ID":"06755863-1a8b-4f4d-a304-03bfd45725ec","Type":"ContainerStarted","Data":"b7b77accaed2345fee295b630eb5adb6b9631c110c247f9ecf3d3fc9ff16886f"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.015294 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" event={"ID":"3745a211-9fa8-41a7-aa26-d733431bc9aa","Type":"ContainerStarted","Data":"5e2bd2b743e8df88f7a26b59b03cc89fccfe5d20dd70c1895ce6c74553b1027b"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.016920 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" event={"ID":"59cdf584-81d0-4d66-8fc2-da3a3f995f73","Type":"ContainerStarted","Data":"9d1cdd41fa91ae41d42bf9383e73d200fdebf020e8f9e1969deb26535791e0cc"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.019160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" event={"ID":"79859b30-67ee-456b-82e5-f8806347a0b9","Type":"ContainerStarted","Data":"0e2ff311b3e5eef53c0624d4be740833a870996d5dce14243ce98f9b18306eda"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.020763 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" event={"ID":"1f03647c-a3e9-4099-9780-e79e3a4d4cf2","Type":"ContainerStarted","Data":"e0ecf6fe4fa34d98fd7c9648eec1e104d0f1c9d28c2d868a1d2c170c384ff0f0"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.020819 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" event={"ID":"1f03647c-a3e9-4099-9780-e79e3a4d4cf2","Type":"ContainerStarted","Data":"cabee604e502b0d0eec8b97ad150a591d062292394114ff343b9bbda957df6ac"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.021290 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.023464 4772 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mnltb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.023515 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" podUID="1f03647c-a3e9-4099-9780-e79e3a4d4cf2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.024485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jplbk" event={"ID":"3150273d-63f7-4908-bcc5-2403e123d1e7","Type":"ContainerStarted","Data":"0c6c8c10410ade3e540c4fbbc835970cf958b2c46809a38faccfa0a9200de343"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.025773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" event={"ID":"01d08621-494b-4232-b678-9caa94e61085","Type":"ContainerStarted","Data":"e840d87b8a5ee61f7c7dde102858ec11b60d0afb9d35b2281b41290cf928fec4"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.032908 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.034112 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.533351216 +0000 UTC m=+146.513960314 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.035843 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" event={"ID":"8800c113-7b51-4554-8e52-c1d0df1a08be","Type":"ContainerStarted","Data":"cb614bb5e97e5646988ab1d613a1a5c80cf9b21832653e0860dfbb43f8404b12"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.036048 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" podStartSLOduration=124.036029293 podStartE2EDuration="2m4.036029293s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.032032348 
+0000 UTC m=+146.012641446" watchObservedRunningTime="2026-01-27 15:09:20.036029293 +0000 UTC m=+146.016638391" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.041037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" event={"ID":"db4a3858-5afa-44c8-a435-2010f7e7340d","Type":"ContainerStarted","Data":"ae81c92d45d0547ff142b72b22b85da95795638755056827220d1a9427feb720"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.044486 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" event={"ID":"c66d8c7d-2de8-492f-ba5e-7ff0e236bf64","Type":"ContainerStarted","Data":"673ad9af094d138247ba57d0ec9f4236fa565d62b07383106835f25e54fd66b8"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.058365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" event={"ID":"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3","Type":"ContainerStarted","Data":"738fd09154f46306d4ba55689df083e507d469edb2551766b356700d6cb0156e"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.067666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" event={"ID":"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd","Type":"ContainerStarted","Data":"b930618ced9b6fc2ff90df7cee8a41329b4824295485280c8b28689467e39a1c"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.072330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" event={"ID":"ffe33359-31f5-4a6c-93fc-6502d2516335","Type":"ContainerStarted","Data":"53fdace7b5e2131fd348c2954060cf41815d5871bf72cef5636e2bab12ed0b7a"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.072985 4772 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" podStartSLOduration=123.07297003 podStartE2EDuration="2m3.07297003s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.072189217 +0000 UTC m=+146.052798315" watchObservedRunningTime="2026-01-27 15:09:20.07297003 +0000 UTC m=+146.053579128" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.073089 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c6n8m" podStartSLOduration=124.073083973 podStartE2EDuration="2m4.073083973s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.053755625 +0000 UTC m=+146.034364723" watchObservedRunningTime="2026-01-27 15:09:20.073083973 +0000 UTC m=+146.053693071" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.076616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" event={"ID":"01d2c3f9-778c-4cf0-b8a4-76583f62df3c","Type":"ContainerStarted","Data":"68ec70224d1a508567ee2ab655eea34f597e6e0cc299689fa5a399faf743e0ab"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.076657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" event={"ID":"01d2c3f9-778c-4cf0-b8a4-76583f62df3c","Type":"ContainerStarted","Data":"993555ddd9411a0fbc04bff6d61b92ed6d128e38ba89d44582647ac7eae8fc72"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.077098 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" 
Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.079494 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" event={"ID":"ee07e18c-9f40-41c3-b2fb-05fd325976e4","Type":"ContainerStarted","Data":"c2c3ba6783909759f21603251263d974ccddd9a3fab183c459ffd4bea1591850"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.079850 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.080499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" event={"ID":"55440e9e-5d99-4244-8b5c-55e2d270313b","Type":"ContainerStarted","Data":"f58ff164a8a285569068e112a435cf8823b51149c0f63ce1fea966019e44a1dc"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.081371 4772 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jdcpn container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.081430 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" podUID="01d2c3f9-778c-4cf0-b8a4-76583f62df3c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.081927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-npths" event={"ID":"5276546c-f731-4bd0-bb93-b5cd19b0992c","Type":"ContainerStarted","Data":"1d51339c275272b67e2e7b518c3e5c43f5ac5e12439635abbf3fb11f3545e804"} Jan 27 
15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.082444 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.084051 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" event={"ID":"b3eefbc3-6dc4-479c-93e4-94a70fda0f83","Type":"ContainerStarted","Data":"1c65cf47eb5b99b895cec64ff223cf5872c35c6374b48f246c3d1c1fc1ecc64c"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.084997 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" event={"ID":"946946b3-f9e0-45e4-803f-edb3f7218489","Type":"ContainerStarted","Data":"cba1cd7e1e663bc926502749eb9df09ac4203e7c5246c4a725f052b689f32b97"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.086397 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" event={"ID":"c6b54ae2-d365-4988-8e69-704574c7962a","Type":"ContainerStarted","Data":"7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.087293 4772 patch_prober.go:28] interesting pod/console-operator-58897d9998-npths container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.087322 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-npths" podUID="5276546c-f731-4bd0-bb93-b5cd19b0992c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" 
Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.089670 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" event={"ID":"34e7a553-e424-472e-a143-76e7e08e57aa","Type":"ContainerStarted","Data":"2907069c7dc18352c253a6dd8a614f2fcda2e6ef3cfec82559f2ecaf55c235bf"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.091989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" event={"ID":"85774248-1879-439e-9dd2-0d8661c299d6","Type":"ContainerStarted","Data":"616c4f9696aa8b033d29e5e8dc00d721a29a5606281dc1c5344744ca66877594"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.095771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-vswtw" event={"ID":"17bdd07d-f7e5-47f8-b730-724d5cc8e3d2","Type":"ContainerStarted","Data":"ed60d396d7c416511dd8344b674f89e48671cdc264384930b775cd7370b999c1"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.096611 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.099645 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" event={"ID":"c8ebf890-c3b0-468e-bf7d-0ec590df084b","Type":"ContainerStarted","Data":"8632589c7dbe4bb64d8d2a9e0983c8088c1ff445e316f1dd7c4e04e72fa148df"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.099685 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" event={"ID":"c8ebf890-c3b0-468e-bf7d-0ec590df084b","Type":"ContainerStarted","Data":"267d22366b6b80c120159c7b29d573289ed71a1b2d51c437b57f97f84c344fdc"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.100244 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.103150 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4lj2h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.103213 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.105754 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" event={"ID":"855486b0-11f8-4ff0-930d-75c7e9d790d3","Type":"ContainerStarted","Data":"aea1a08e5aad39ed9a0a97079f3d1003246d905f1fda942ca2d0ca870f020eb3"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.109452 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.109705 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.123998 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-fw6bh" podStartSLOduration=7.123973693 podStartE2EDuration="7.123973693s" podCreationTimestamp="2026-01-27 15:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.120669007 +0000 UTC m=+146.101278105" watchObservedRunningTime="2026-01-27 15:09:20.123973693 +0000 UTC m=+146.104582791" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.129758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" event={"ID":"dbd05529-6d54-416a-8df0-5973ee3179b6","Type":"ContainerStarted","Data":"f8a87a1260878fbad151d7e44e315c2aed855c2914f6ae6266e3f184df9702f6"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.134484 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.136353 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.63633539 +0000 UTC m=+146.616944488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.155640 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.155921 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.155995 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.156199 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.158053 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" event={"ID":"2c9eef7e-3996-45b6-ab7b-50d319dc1117","Type":"ContainerStarted","Data":"79c4e74a245ca3f2ef71fe6a698ce5a12f2c8cc3d935736d847f9759a8d4904d"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.165563 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" event={"ID":"192d0f8f-10f9-43e2-a24a-2019aae0db44","Type":"ContainerStarted","Data":"d49a7fd9cabdd4167adee013f8da89e57a16016ffc4a14822acbc6007a368377"} Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.176333 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" podStartSLOduration=123.176317214 podStartE2EDuration="2m3.176317214s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.150646393 +0000 UTC m=+146.131255491" watchObservedRunningTime="2026-01-27 15:09:20.176317214 +0000 UTC m=+146.156926312" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.176703 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" podStartSLOduration=124.176697355 podStartE2EDuration="2m4.176697355s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.175879182 +0000 UTC m=+146.156488300" watchObservedRunningTime="2026-01-27 15:09:20.176697355 +0000 UTC m=+146.157306453" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.195852 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-npths" podStartSLOduration=124.195834658 podStartE2EDuration="2m4.195834658s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.194433218 +0000 UTC m=+146.175042336" watchObservedRunningTime="2026-01-27 15:09:20.195834658 +0000 UTC m=+146.176443746" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.216348 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" podStartSLOduration=123.21631503 podStartE2EDuration="2m3.21631503s" 
podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.214106526 +0000 UTC m=+146.194715624" watchObservedRunningTime="2026-01-27 15:09:20.21631503 +0000 UTC m=+146.196924128" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.236306 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.239083 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.739041376 +0000 UTC m=+146.719650474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.239846 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" podStartSLOduration=124.239817698 podStartE2EDuration="2m4.239817698s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.234377771 +0000 UTC m=+146.214986869" watchObservedRunningTime="2026-01-27 15:09:20.239817698 +0000 UTC m=+146.220426796" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.263672 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" podStartSLOduration=124.263655197 podStartE2EDuration="2m4.263655197s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.262086751 +0000 UTC m=+146.242695849" watchObservedRunningTime="2026-01-27 15:09:20.263655197 +0000 UTC m=+146.244264295" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.268318 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.286802 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zkmjj" podStartSLOduration=123.286782174 podStartE2EDuration="2m3.286782174s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.284063866 +0000 UTC m=+146.264672964" watchObservedRunningTime="2026-01-27 15:09:20.286782174 +0000 UTC m=+146.267391272" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.322592 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-9wv77" podStartSLOduration=123.322578358 podStartE2EDuration="2m3.322578358s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.306100052 +0000 UTC m=+146.286709150" watchObservedRunningTime="2026-01-27 15:09:20.322578358 +0000 UTC m=+146.303187456" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.323740 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-vswtw" podStartSLOduration=124.323732012 podStartE2EDuration="2m4.323732012s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.319635463 +0000 UTC m=+146.300244561" watchObservedRunningTime="2026-01-27 15:09:20.323732012 +0000 UTC m=+146.304341110" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.337428 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.337778 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.837763847 +0000 UTC m=+146.818372945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.348178 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-zq27x" podStartSLOduration=124.348151017 podStartE2EDuration="2m4.348151017s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:20.346070757 +0000 UTC m=+146.326679865" watchObservedRunningTime="2026-01-27 15:09:20.348151017 +0000 UTC m=+146.328760115" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.426015 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4v228" podStartSLOduration=124.425995445 podStartE2EDuration="2m4.425995445s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 15:09:20.418834068 +0000 UTC m=+146.399443166" watchObservedRunningTime="2026-01-27 15:09:20.425995445 +0000 UTC m=+146.406604543" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.438701 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.439112 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:20.939096893 +0000 UTC m=+146.919705981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.540370 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.542253 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.042232761 +0000 UTC m=+147.022841859 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.643257 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.643630 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.143603779 +0000 UTC m=+147.124212877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.664406 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:20 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:20 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:20 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.664789 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.746646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.746948 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:21.246932693 +0000 UTC m=+147.227541781 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.848922 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.849467 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.349447893 +0000 UTC m=+147.330057061 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.950556 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.950840 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.450815 +0000 UTC m=+147.431424088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:20 crc kubenswrapper[4772]: I0127 15:09:20.950906 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:20 crc kubenswrapper[4772]: E0127 15:09:20.951416 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.451404097 +0000 UTC m=+147.432013185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.052635 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.052820 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.552786745 +0000 UTC m=+147.533395843 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.052972 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.053363 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.553349801 +0000 UTC m=+147.533958899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.153967 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.154209 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.654157392 +0000 UTC m=+147.634766490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.154691 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.155039 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.655021137 +0000 UTC m=+147.635630235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.170309 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" event={"ID":"85774248-1879-439e-9dd2-0d8661c299d6","Type":"ContainerStarted","Data":"c77561dd43c0905e11cc9da17704b2525d4b7afef4f98f4af607bef043388148"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.172283 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" event={"ID":"946946b3-f9e0-45e4-803f-edb3f7218489","Type":"ContainerStarted","Data":"0ff467fa052293b06237dea9bb855a2e878b19713b29159eec9015ee3fa68cd1"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.174352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" event={"ID":"c6b54ae2-d365-4988-8e69-704574c7962a","Type":"ContainerStarted","Data":"4f5ed02624877f82608d4a7a5fead892a80497d0b63bf729eaa6c0d56cf6aac6"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.176365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" event={"ID":"06755863-1a8b-4f4d-a304-03bfd45725ec","Type":"ContainerStarted","Data":"83143922be92b5fa0adc3d1c04d2777313e5d1d785f87047a8d0acebd953a6ee"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.177914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" event={"ID":"3745a211-9fa8-41a7-aa26-d733431bc9aa","Type":"ContainerStarted","Data":"a5ebf3fad362d41f306cdc320239a12f9391ce6f794ea71a601b38d8cd31e4df"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.177957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" event={"ID":"3745a211-9fa8-41a7-aa26-d733431bc9aa","Type":"ContainerStarted","Data":"2bce162515e7b97ab026e1bb12a122d93bb3ceabcd5847641abc7df9d4da0ccd"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.179698 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" event={"ID":"f9ab8468-b920-41e9-a5f2-1af70f1b5ffd","Type":"ContainerStarted","Data":"22a70d5916ada3cb7b0f3cfc5aa6a47b5d556678fc4040abd3c926b168155a0a"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.181203 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" event={"ID":"dbd05529-6d54-416a-8df0-5973ee3179b6","Type":"ContainerStarted","Data":"64ec27ae905d0cb73ec74a804453f8d9836f7bdc705f2062bf22fde875ba140a"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.182826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" event={"ID":"b3eefbc3-6dc4-479c-93e4-94a70fda0f83","Type":"ContainerStarted","Data":"e9b923018e39913453724269344c98b4510c0d3afe324798fdc10365d8b1c22b"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.182975 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.183948 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" 
event={"ID":"c4354b42-c43f-45ce-b1e5-f1d6e0ed1bd3","Type":"ContainerStarted","Data":"48ed2cade3088ad9ef9e0aee7dc9a0a76326ddf212937d1b64722bb46b50b984"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.185442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" event={"ID":"01d08621-494b-4232-b678-9caa94e61085","Type":"ContainerStarted","Data":"082c258a684655d96d16487fead32cf89c30175a36b818bc16b83999d0295232"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.186111 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.187836 4772 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cf6v7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.187870 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" podUID="01d08621-494b-4232-b678-9caa94e61085" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.188191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jplbk" event={"ID":"3150273d-63f7-4908-bcc5-2403e123d1e7","Type":"ContainerStarted","Data":"0fe2fd92bdfb6082b1eced78bf89302ffcb2851203a712765ffc9978cba71854"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.188594 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.190019 
4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" event={"ID":"794cdbb9-3392-465a-8a0a-a78a465aee2b","Type":"ContainerStarted","Data":"3a614d57463b1f07afccc480ba9fd3a454c9119bb8aa40cd8763206f105d3d25"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.191309 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" event={"ID":"db4a3858-5afa-44c8-a435-2010f7e7340d","Type":"ContainerStarted","Data":"dcc89b74c73c56621c851823bb624cba5d70cdd0a5cf057ba421dbea811cfd80"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.194327 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" event={"ID":"55440e9e-5d99-4244-8b5c-55e2d270313b","Type":"ContainerStarted","Data":"bd13bc4d5640fefd1ee1b05a0f76dd4bd95062a26abbdf38f089811f6ef9d615"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.194362 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" event={"ID":"55440e9e-5d99-4244-8b5c-55e2d270313b","Type":"ContainerStarted","Data":"15cc4626a580771e8008c7c13476968249bb59abaa3c5706dc853bb20066c974"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.200141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" event={"ID":"8800c113-7b51-4554-8e52-c1d0df1a08be","Type":"ContainerStarted","Data":"6af6dcdc0b1ff60e46ab4aaec43713a1460bc2484adfbbac9528161c3edb645a"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.201388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" event={"ID":"59cdf584-81d0-4d66-8fc2-da3a3f995f73","Type":"ContainerStarted","Data":"a4f9e554de172277e02dd3d65d14bce9ec30394850fee45550108a3b8df8b44c"} 
Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.201424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" event={"ID":"59cdf584-81d0-4d66-8fc2-da3a3f995f73","Type":"ContainerStarted","Data":"3f03f69ab90e8bff4b373cfa302b43affe0a5f955abea222fe2bbe3d950ac56d"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.202371 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fbwkz" event={"ID":"ffe33359-31f5-4a6c-93fc-6502d2516335","Type":"ContainerStarted","Data":"49e23824b145bf79c0d3e66cc5f10aad5c7da00f1e7e173bb52b34665a531f44"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.203911 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" event={"ID":"ce4b4e2e-496b-4334-8736-db4f25473731","Type":"ContainerStarted","Data":"ea62d58e4b98cc0c9d276cb1c41f819e098e87a78641d39d6c12c00942a8a45b"} Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204484 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204522 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204485 4772 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-jdcpn container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204734 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" podUID="01d2c3f9-778c-4cf0-b8a4-76583f62df3c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204825 4772 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mnltb container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204856 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" podUID="1f03647c-a3e9-4099-9780-e79e3a4d4cf2" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.204993 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.205139 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4lj2h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.205185 4772 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.207407 4772 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fgw98 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" start-of-body= Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.207448 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": dial tcp 10.217.0.16:6443: connect: connection refused" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.218988 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-wk7gd" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.230796 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-djmb4" podStartSLOduration=124.230777425 podStartE2EDuration="2m4.230777425s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.230290161 +0000 UTC m=+147.210899269" watchObservedRunningTime="2026-01-27 15:09:21.230777425 +0000 UTC m=+147.211386523" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.232503 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-v7l4k" podStartSLOduration=124.232496215 podStartE2EDuration="2m4.232496215s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.198292967 +0000 UTC m=+147.178902065" watchObservedRunningTime="2026-01-27 15:09:21.232496215 +0000 UTC m=+147.213105313" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.255327 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.256736 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.756712764 +0000 UTC m=+147.737321932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.264623 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-w4j4h" podStartSLOduration=125.264609632 podStartE2EDuration="2m5.264609632s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.262231953 +0000 UTC m=+147.242841051" watchObservedRunningTime="2026-01-27 15:09:21.264609632 +0000 UTC m=+147.245218730" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.286375 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f9fqs" podStartSLOduration=125.28636129 podStartE2EDuration="2m5.28636129s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.284814796 +0000 UTC m=+147.265423894" watchObservedRunningTime="2026-01-27 15:09:21.28636129 +0000 UTC m=+147.266970388" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.351063 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-24dmv" podStartSLOduration=124.351023228 podStartE2EDuration="2m4.351023228s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.310698853 +0000 UTC m=+147.291307951" watchObservedRunningTime="2026-01-27 15:09:21.351023228 +0000 UTC m=+147.331632336" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.354280 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" podStartSLOduration=125.354265511 podStartE2EDuration="2m5.354265511s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.352950093 +0000 UTC m=+147.333559191" watchObservedRunningTime="2026-01-27 15:09:21.354265511 +0000 UTC m=+147.334874619" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.359375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.359674 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.859662337 +0000 UTC m=+147.840271435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.421109 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" podStartSLOduration=124.421088001 podStartE2EDuration="2m4.421088001s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.389673474 +0000 UTC m=+147.370282572" watchObservedRunningTime="2026-01-27 15:09:21.421088001 +0000 UTC m=+147.401697099" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.422437 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pwmhd" podStartSLOduration=125.42242856 podStartE2EDuration="2m5.42242856s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.419261308 +0000 UTC m=+147.399870416" watchObservedRunningTime="2026-01-27 15:09:21.42242856 +0000 UTC m=+147.403037658" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.437921 4772 patch_prober.go:28] interesting pod/apiserver-76f77b778f-2h2z8 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]log ok Jan 
27 15:09:21 crc kubenswrapper[4772]: [+]etcd ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/generic-apiserver-start-informers ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/max-in-flight-filter ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 27 15:09:21 crc kubenswrapper[4772]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 27 15:09:21 crc kubenswrapper[4772]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/project.openshift.io-projectcache ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/openshift.io-startinformers ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 27 15:09:21 crc kubenswrapper[4772]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 27 15:09:21 crc kubenswrapper[4772]: livez check failed Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.438062 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" podUID="34e7a553-e424-472e-a143-76e7e08e57aa" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.448739 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jplbk" podStartSLOduration=8.448716929 podStartE2EDuration="8.448716929s" podCreationTimestamp="2026-01-27 15:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.446089123 +0000 UTC m=+147.426698251" watchObservedRunningTime="2026-01-27 15:09:21.448716929 +0000 UTC m=+147.429326027" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.460290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.460786 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:21.960767797 +0000 UTC m=+147.941376885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.520981 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5vmlj" podStartSLOduration=124.520963215 podStartE2EDuration="2m4.520963215s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.472583868 +0000 UTC m=+147.453192966" watchObservedRunningTime="2026-01-27 15:09:21.520963215 +0000 UTC m=+147.501572313" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.522267 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-g667l" podStartSLOduration=125.522261913 podStartE2EDuration="2m5.522261913s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.508078663 +0000 UTC m=+147.488687761" watchObservedRunningTime="2026-01-27 15:09:21.522261913 +0000 UTC m=+147.502871011" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.564950 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.565341 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.065327186 +0000 UTC m=+148.045936284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.587790 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njx6w" podStartSLOduration=124.587772954 podStartE2EDuration="2m4.587772954s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.554440792 +0000 UTC m=+147.535049910" watchObservedRunningTime="2026-01-27 15:09:21.587772954 +0000 UTC m=+147.568382052" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.607351 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-klfsg" podStartSLOduration=125.607307619 podStartE2EDuration="2m5.607307619s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.607262377 +0000 UTC m=+147.587871485" watchObservedRunningTime="2026-01-27 15:09:21.607307619 +0000 UTC m=+147.587916717" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.608340 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" podStartSLOduration=124.608333428 podStartE2EDuration="2m4.608333428s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.588625999 +0000 UTC m=+147.569235107" watchObservedRunningTime="2026-01-27 15:09:21.608333428 +0000 UTC m=+147.588942526" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.614534 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-npths" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.654021 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:21 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:21 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:21 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.654371 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.665855 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.666339 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.166321413 +0000 UTC m=+148.146930511 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.692799 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xqtff" podStartSLOduration=124.69278125700001 podStartE2EDuration="2m4.692781257s" podCreationTimestamp="2026-01-27 15:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.692380245 +0000 UTC m=+147.672989353" watchObservedRunningTime="2026-01-27 15:09:21.692781257 +0000 UTC m=+147.673390355" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.758758 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bhgv8" podStartSLOduration=125.758743022 podStartE2EDuration="2m5.758743022s" podCreationTimestamp="2026-01-27 
15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:21.730586089 +0000 UTC m=+147.711195187" watchObservedRunningTime="2026-01-27 15:09:21.758743022 +0000 UTC m=+147.739352120" Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.768044 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.768686 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.268669778 +0000 UTC m=+148.249278886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.868987 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.869204 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.36915747 +0000 UTC m=+148.349766568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.869357 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.869676 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.369661105 +0000 UTC m=+148.350270203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:21 crc kubenswrapper[4772]: I0127 15:09:21.969950 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:21 crc kubenswrapper[4772]: E0127 15:09:21.970248 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.470233508 +0000 UTC m=+148.450842606 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.071740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.072125 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.57210884 +0000 UTC m=+148.552717938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.173276 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.173467 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.673439906 +0000 UTC m=+148.654049004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.173544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.173838 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.673827277 +0000 UTC m=+148.654436365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.225065 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4lj2h container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.225113 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.225433 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.225483 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.274958 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.275088 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.775069461 +0000 UTC m=+148.755678559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.276955 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.282302 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.78228732 +0000 UTC m=+148.762896418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.378803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.379109 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.879093375 +0000 UTC m=+148.859702483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.392275 4772 csr.go:261] certificate signing request csr-x7cq6 is approved, waiting to be issued Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.397050 4772 csr.go:257] certificate signing request csr-x7cq6 is issued Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.481712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.482047 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:22.982036118 +0000 UTC m=+148.962645216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.582277 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.582478 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.082462598 +0000 UTC m=+149.063071696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.582781 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.583055 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.083047945 +0000 UTC m=+149.063657043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.583328 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.653154 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:22 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:22 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:22 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.653217 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.689661 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.689963 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.690014 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.690054 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.690097 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.691210 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:09:23.191185208 +0000 UTC m=+149.171794306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.691809 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.702207 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.702216 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.720683 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.767835 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bck4j" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.791636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.792030 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.292015029 +0000 UTC m=+149.272624127 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.800352 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.884856 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.894627 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:22 crc kubenswrapper[4772]: E0127 15:09:22.895072 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.395057355 +0000 UTC m=+149.375666453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.897522 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:09:22 crc kubenswrapper[4772]: I0127 15:09:22.995831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.020266 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.52023638 +0000 UTC m=+149.500845478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.103897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.104474 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.604452492 +0000 UTC m=+149.585061610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.205915 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.206455 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.706442817 +0000 UTC m=+149.687051915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.233706 4772 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-cf6v7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.233758 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" podUID="01d08621-494b-4232-b678-9caa94e61085" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.267354 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" event={"ID":"794cdbb9-3392-465a-8a0a-a78a465aee2b","Type":"ContainerStarted","Data":"458ba566568151ced1efe9b01df96546d245e5b95865522f4163f8c7b9dc4145"} Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.267393 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" event={"ID":"794cdbb9-3392-465a-8a0a-a78a465aee2b","Type":"ContainerStarted","Data":"12b644f033abf82b2aa10593fc8f1f5c89fe7b2f0adfc95ab3ba0235049af613"} Jan 27 15:09:23 crc kubenswrapper[4772]: 
I0127 15:09:23.307657 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.309012 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.808997729 +0000 UTC m=+149.789606827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.344799 4772 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.399267 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 15:04:22 +0000 UTC, rotation deadline is 2026-12-04 07:51:36.577008503 +0000 UTC Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.399306 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7456h42m13.177705526s for next certificate rotation Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.410791 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.411078 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:23.911065496 +0000 UTC m=+149.891674594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.489194 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95rh9"] Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.490121 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.499599 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.511863 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.512300 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:24.012284599 +0000 UTC m=+149.992893697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.558585 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rh9"] Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.613853 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.613893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.613944 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvrk\" (UniqueName: \"kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.613974 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.614271 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:24.114260424 +0000 UTC m=+150.094869522 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.656365 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:23 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:23 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:23 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.656831 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:23 crc 
kubenswrapper[4772]: I0127 15:09:23.682488 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9wdps"] Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.697968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.701209 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.714589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.714734 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.714757 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.714780 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvrk\" (UniqueName: \"kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk\") pod 
\"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.715099 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:09:24.215085716 +0000 UTC m=+150.195694814 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.715499 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.715622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.721631 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wdps"] Jan 27 15:09:23 crc kubenswrapper[4772]: W0127 15:09:23.752738 4772 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c6ced8694bd7595fee00b262ac36c45bd68f29fbefa1dc4a8eaae1198c1d0493 WatchSource:0}: Error finding container c6ced8694bd7595fee00b262ac36c45bd68f29fbefa1dc4a8eaae1198c1d0493: Status 404 returned error can't find the container with id c6ced8694bd7595fee00b262ac36c45bd68f29fbefa1dc4a8eaae1198c1d0493 Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.765196 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvrk\" (UniqueName: \"kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk\") pod \"community-operators-95rh9\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") " pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.815474 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.815549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.815618 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.815645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhkv\" (UniqueName: \"kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: E0127 15:09:23.816000 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:09:24.31598888 +0000 UTC m=+150.296597968 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-crlcr" (UID: "877de785-bc18-4c1c-970a-1e6533539467") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.844512 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.859906 4772 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T15:09:23.344822003Z","Handler":null,"Name":""} Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.865894 4772 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.865939 4772 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.881803 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jwrpk"] Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.885015 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.898401 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwrpk"] Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918181 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhkv\" (UniqueName: \"kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918483 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67q8x\" (UniqueName: \"kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918513 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.918573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.919451 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.919493 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.928253 4772 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 15:09:23 crc kubenswrapper[4772]: I0127 15:09:23.951868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhkv\" (UniqueName: \"kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv\") pod \"certified-operators-9wdps\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") " pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.020323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.020657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.020707 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.020765 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67q8x\" (UniqueName: \"kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.020967 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cf6v7" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.021575 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.021594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.024943 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.027819 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.027853 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.044941 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67q8x\" (UniqueName: \"kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x\") pod \"community-operators-jwrpk\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") " pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.094872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-crlcr\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.114295 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.116415 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.162775 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.208986 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rh9"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.216007 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.223806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.223897 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9qx\" (UniqueName: \"kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.224013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.288935 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"12b334f2ce60be8c17d4eadace35220018c7dd18ca877512f3960dae41a53451"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.288978 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c6ced8694bd7595fee00b262ac36c45bd68f29fbefa1dc4a8eaae1198c1d0493"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.292055 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"065902ad26e91894adf3e424b61f04e80232696bd334ca7b0c2dadf70f3db0ea"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.292088 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"25ed67dd4363c69e0895784cd995321f5be2d7e99c48536df754953077e3f64b"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.292283 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.294523 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" event={"ID":"794cdbb9-3392-465a-8a0a-a78a465aee2b","Type":"ContainerStarted","Data":"00440aa18fc91af0074763f2776da1676a6e036d0e6f05e019e4e01bcb358a23"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.305614 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" 
event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerStarted","Data":"ae198d4139eb016b136b591cf513d4e1d588e78f4cc1966851a08fad44048adb"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.311357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7f3417080ebdeac18b7818cfe03adc19876ba742c9b98f9f66e8089e83a729af"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.311403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"600800860d1d7b700cbc19d04957ce86b9999ffdbabf36c53b619fc569f4ba28"} Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.324980 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.325049 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.325102 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9qx\" (UniqueName: \"kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " 
pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.326314 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.326623 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.364250 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9qx\" (UniqueName: \"kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx\") pod \"certified-operators-wcldz\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.382115 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hbbxh" podStartSLOduration=11.382087837 podStartE2EDuration="11.382087837s" podCreationTimestamp="2026-01-27 15:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:24.351372211 +0000 UTC m=+150.331981299" watchObservedRunningTime="2026-01-27 15:09:24.382087837 +0000 UTC m=+150.362696945" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.394914 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.477543 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.519981 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9wdps"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.602803 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jwrpk"] Jan 27 15:09:24 crc kubenswrapper[4772]: W0127 15:09:24.630832 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd415ccf_2b4a_4797_962f_a464ef96bc22.slice/crio-625e8aba305d47657203f1b68ee02b451e267b67921c5b887eaa601500155d6c WatchSource:0}: Error finding container 625e8aba305d47657203f1b68ee02b451e267b67921c5b887eaa601500155d6c: Status 404 returned error can't find the container with id 625e8aba305d47657203f1b68ee02b451e267b67921c5b887eaa601500155d6c Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.652199 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:24 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:24 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:24 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.652265 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 
27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.676488 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.696216 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.923469 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.924647 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.926100 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.931660 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.931826 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 15:09:24 crc kubenswrapper[4772]: I0127 15:09:24.999728 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:09:25 crc kubenswrapper[4772]: W0127 15:09:25.001127 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod987488b4_af4d_4b20_bb26_f433d4d1299a.slice/crio-cca259e2810aba1b9b6e47d70088b4b43bd08d57432f8fbb61e7ddcd0a7abb94 WatchSource:0}: Error finding container cca259e2810aba1b9b6e47d70088b4b43bd08d57432f8fbb61e7ddcd0a7abb94: Status 404 returned error can't find the container with id 
cca259e2810aba1b9b6e47d70088b4b43bd08d57432f8fbb61e7ddcd0a7abb94 Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.046035 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.046080 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.147768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.147845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.147917 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.168519 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.176934 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.180825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-2h2z8" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.248914 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.348251 4772 generic.go:334] "Generic (PLEG): container finished" podID="c6b54ae2-d365-4988-8e69-704574c7962a" containerID="4f5ed02624877f82608d4a7a5fead892a80497d0b63bf729eaa6c0d56cf6aac6" exitCode=0 Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.348365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" event={"ID":"c6b54ae2-d365-4988-8e69-704574c7962a","Type":"ContainerDied","Data":"4f5ed02624877f82608d4a7a5fead892a80497d0b63bf729eaa6c0d56cf6aac6"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.363097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerStarted","Data":"cca259e2810aba1b9b6e47d70088b4b43bd08d57432f8fbb61e7ddcd0a7abb94"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.375259 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" event={"ID":"877de785-bc18-4c1c-970a-1e6533539467","Type":"ContainerStarted","Data":"a3585a039b9cbf60a67ac7ced2eaf947fce2a88abe7705503eb446ef5ad9fc74"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.376353 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.378177 4772 generic.go:334] "Generic (PLEG): container finished" podID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerID="16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd" exitCode=0 Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.378228 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerDied","Data":"16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.379957 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.381337 4772 generic.go:334] "Generic (PLEG): container finished" podID="96e88efd-1f25-4e44-b459-ab773db93656" containerID="19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81" exitCode=0 Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.381388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerDied","Data":"19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.381411 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" 
event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerStarted","Data":"806bc56d016ac75a91f0a1effbd4a1494b65e83f8a957a34cd38d253ad927cc3"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.387251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerStarted","Data":"625e8aba305d47657203f1b68ee02b451e267b67921c5b887eaa601500155d6c"} Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.419540 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" podStartSLOduration=129.419512406 podStartE2EDuration="2m9.419512406s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:25.39607262 +0000 UTC m=+151.376681718" watchObservedRunningTime="2026-01-27 15:09:25.419512406 +0000 UTC m=+151.400121504" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.443072 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.445154 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.448674 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.452901 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.477704 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.555883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.555961 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.595088 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.656320 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:25 crc 
kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:25 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:25 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.656527 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.657294 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.657374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.657436 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.691610 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xp8ph"] Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.698460 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.704118 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.706766 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.707375 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xp8ph"] Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.758532 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.758586 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.758616 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g6kk\" (UniqueName: \"kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk\") pod \"redhat-marketplace-xp8ph\" (UID: 
\"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.789553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.859814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.859882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.859923 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g6kk\" (UniqueName: \"kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.860682 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.860884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:25 crc kubenswrapper[4772]: I0127 15:09:25.887825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g6kk\" (UniqueName: \"kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk\") pod \"redhat-marketplace-xp8ph\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.024809 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.070959 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dfgjh"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.072095 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.120628 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfgjh"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.163981 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.164097 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.164153 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87w6q\" (UniqueName: \"kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.206044 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.238205 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xp8ph"] Jan 27 15:09:26 crc kubenswrapper[4772]: W0127 15:09:26.251242 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2b5800_ce98_4847_bfcd_67a97375aa1b.slice/crio-d8c8cf461f8b99ac3badb881b8d4938b4c0d57c4110c08a8b732cf718594c112 WatchSource:0}: Error finding container d8c8cf461f8b99ac3badb881b8d4938b4c0d57c4110c08a8b732cf718594c112: Status 404 returned error can't find the container with id d8c8cf461f8b99ac3badb881b8d4938b4c0d57c4110c08a8b732cf718594c112 Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.265096 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.265179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.265244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87w6q\" (UniqueName: \"kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.265620 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc 
kubenswrapper[4772]: I0127 15:09:26.274343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.283406 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87w6q\" (UniqueName: \"kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q\") pod \"redhat-marketplace-dfgjh\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") " pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.389070 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.396650 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerStarted","Data":"d8c8cf461f8b99ac3badb881b8d4938b4c0d57c4110c08a8b732cf718594c112"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.398777 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" event={"ID":"877de785-bc18-4c1c-970a-1e6533539467","Type":"ContainerStarted","Data":"228e6fd0668bf433c1f6aa09021f79564dfe5e7bb750301de0ab0cbfce9f1ef2"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.400904 4772 generic.go:334] "Generic (PLEG): container finished" podID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerID="759437d05ef598aec5d4669f7ffea07fc52730444984e390ed6235fe2f84e271" exitCode=0 Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.401084 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerDied","Data":"759437d05ef598aec5d4669f7ffea07fc52730444984e390ed6235fe2f84e271"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.407981 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53a6c4ad-816a-4d22-af98-0587a6a68304","Type":"ContainerStarted","Data":"63104e6a47ff51e625e55ab363bafa9155dc299b2880304fb3771d251df37ba5"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.408018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53a6c4ad-816a-4d22-af98-0587a6a68304","Type":"ContainerStarted","Data":"4e5fb5c0a561c986217a03a765b2818013365a6cf31c0cb2523e728110829fe1"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.412917 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"545a0fd1-38f1-4cbc-9f37-1870b1673589","Type":"ContainerStarted","Data":"fb9d01e0f3553308e4bfd0ea1eb0d0086a446ca7d54e19f2aad89776bdcb2b2b"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.419024 4772 generic.go:334] "Generic (PLEG): container finished" podID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerID="05b1bd7fef5819af5a4449e0a65b00c0b94c405308dc0ea03120ac4091e22a7a" exitCode=0 Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.420767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerDied","Data":"05b1bd7fef5819af5a4449e0a65b00c0b94c405308dc0ea03120ac4091e22a7a"} Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.439254 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.439229263 
podStartE2EDuration="2.439229263s" podCreationTimestamp="2026-01-27 15:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:09:26.431342785 +0000 UTC m=+152.411951903" watchObservedRunningTime="2026-01-27 15:09:26.439229263 +0000 UTC m=+152.419838361" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.540614 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.541001 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.559324 4772 patch_prober.go:28] interesting pod/console-f9d7485db-7qfrl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.559387 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7qfrl" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.592221 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.592296 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.592480 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.592533 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.647627 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.672694 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:26 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:26 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:26 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.672752 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.684854 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-75jrg"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.687421 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.693621 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.696402 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75jrg"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.761776 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.807565 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfgjh"] Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.878577 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume\") pod \"c6b54ae2-d365-4988-8e69-704574c7962a\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.878652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume\") pod \"c6b54ae2-d365-4988-8e69-704574c7962a\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.878704 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llq59\" (UniqueName: \"kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59\") pod 
\"c6b54ae2-d365-4988-8e69-704574c7962a\" (UID: \"c6b54ae2-d365-4988-8e69-704574c7962a\") " Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.879083 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxxl\" (UniqueName: \"kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.879158 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.879262 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.881497 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume" (OuterVolumeSpecName: "config-volume") pod "c6b54ae2-d365-4988-8e69-704574c7962a" (UID: "c6b54ae2-d365-4988-8e69-704574c7962a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.886523 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c6b54ae2-d365-4988-8e69-704574c7962a" (UID: "c6b54ae2-d365-4988-8e69-704574c7962a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.886675 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59" (OuterVolumeSpecName: "kube-api-access-llq59") pod "c6b54ae2-d365-4988-8e69-704574c7962a" (UID: "c6b54ae2-d365-4988-8e69-704574c7962a"). InnerVolumeSpecName "kube-api-access-llq59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981439 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxxl\" (UniqueName: \"kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981487 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities\") pod 
\"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981532 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c6b54ae2-d365-4988-8e69-704574c7962a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981544 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c6b54ae2-d365-4988-8e69-704574c7962a-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981553 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llq59\" (UniqueName: \"kubernetes.io/projected/c6b54ae2-d365-4988-8e69-704574c7962a-kube-api-access-llq59\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.981927 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:26 crc kubenswrapper[4772]: I0127 15:09:26.982142 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content\") pod \"redhat-operators-75jrg\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.002421 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxxl\" (UniqueName: \"kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl\") pod \"redhat-operators-75jrg\" (UID: 
\"f637b998-b13b-486d-9042-4cd40a01c833\") " pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.029108 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.069245 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-jdcpn" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.073392 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k7pfr"] Jan 27 15:09:27 crc kubenswrapper[4772]: E0127 15:09:27.073613 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6b54ae2-d365-4988-8e69-704574c7962a" containerName="collect-profiles" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.073627 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6b54ae2-d365-4988-8e69-704574c7962a" containerName="collect-profiles" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.073734 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6b54ae2-d365-4988-8e69-704574c7962a" containerName="collect-profiles" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.074538 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.083335 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhcz\" (UniqueName: \"kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.083401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.083425 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.095201 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7pfr"] Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.148492 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mnltb" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.184109 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content\") pod \"redhat-operators-k7pfr\" (UID: 
\"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.184501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhcz\" (UniqueName: \"kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.184570 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.184958 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.185161 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.222372 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhcz\" (UniqueName: \"kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz\") pod \"redhat-operators-k7pfr\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") " 
pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.345598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.391849 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.427979 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" event={"ID":"c6b54ae2-d365-4988-8e69-704574c7962a","Type":"ContainerDied","Data":"7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.428020 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de1df4376bfe65e7f653cd434cd6a00a28483c62ada492613581e939925776c" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.428127 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.444772 4772 generic.go:334] "Generic (PLEG): container finished" podID="53a6c4ad-816a-4d22-af98-0587a6a68304" containerID="63104e6a47ff51e625e55ab363bafa9155dc299b2880304fb3771d251df37ba5" exitCode=0 Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.444830 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53a6c4ad-816a-4d22-af98-0587a6a68304","Type":"ContainerDied","Data":"63104e6a47ff51e625e55ab363bafa9155dc299b2880304fb3771d251df37ba5"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.447350 4772 generic.go:334] "Generic (PLEG): container finished" podID="545a0fd1-38f1-4cbc-9f37-1870b1673589" containerID="4635120e4ed4dd9ae851ebf881fd2e955918cdec5e9f2fb5b5236a34009790a9" exitCode=0 Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.447377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"545a0fd1-38f1-4cbc-9f37-1870b1673589","Type":"ContainerDied","Data":"4635120e4ed4dd9ae851ebf881fd2e955918cdec5e9f2fb5b5236a34009790a9"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.475426 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerID="2b642630dd7f7b63f30ba841d8958c4ae79e62858aee5dff568bed444b47b036" exitCode=0 Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.475558 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerDied","Data":"2b642630dd7f7b63f30ba841d8958c4ae79e62858aee5dff568bed444b47b036"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.490405 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerID="e90d644c15f4a502d49563577bfa11dc77829d65c3be871a6762542c2dc18bcb" exitCode=0 Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.491612 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerDied","Data":"e90d644c15f4a502d49563577bfa11dc77829d65c3be871a6762542c2dc18bcb"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.491653 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerStarted","Data":"a43ec03d3d30e58e4b1f455c9ffe4cf357362515c513668efc22ad8bedf315c3"} Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.650835 4772 patch_prober.go:28] interesting pod/router-default-5444994796-7k7sg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:09:27 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Jan 27 15:09:27 crc kubenswrapper[4772]: [+]process-running ok Jan 27 15:09:27 crc kubenswrapper[4772]: healthz check failed Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.651239 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7k7sg" podUID="c4225ddc-bdcd-4158-811b-113234d0c3d0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.652306 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75jrg"] Jan 27 15:09:27 crc kubenswrapper[4772]: W0127 15:09:27.697560 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf637b998_b13b_486d_9042_4cd40a01c833.slice/crio-e6520bb81fcb94747f96d0bde0c4f99cc40f51207c4752c3b45e79ef35a202b3 WatchSource:0}: Error finding container e6520bb81fcb94747f96d0bde0c4f99cc40f51207c4752c3b45e79ef35a202b3: Status 404 returned error can't find the container with id e6520bb81fcb94747f96d0bde0c4f99cc40f51207c4752c3b45e79ef35a202b3 Jan 27 15:09:27 crc kubenswrapper[4772]: I0127 15:09:27.700827 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k7pfr"] Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.499156 4772 generic.go:334] "Generic (PLEG): container finished" podID="d0b33686-8107-4caf-b67f-3c608119a049" containerID="142c7ce0c1e97146a8a91a5cb46adbe2ee5537497547ac756fc38abfd7afe96c" exitCode=0 Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.499219 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerDied","Data":"142c7ce0c1e97146a8a91a5cb46adbe2ee5537497547ac756fc38abfd7afe96c"} Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.499487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerStarted","Data":"73b204aeb508c9b52f9299efc082b371c9a1b4cb36fb76b5cd3cb0d31f443821"} Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.501654 4772 generic.go:334] "Generic (PLEG): container finished" podID="f637b998-b13b-486d-9042-4cd40a01c833" containerID="980163ea21706e95ea0803e7c47d4cf7427d498f1351e03fac04539f250bddc6" exitCode=0 Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.501737 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" 
event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerDied","Data":"980163ea21706e95ea0803e7c47d4cf7427d498f1351e03fac04539f250bddc6"} Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.501763 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerStarted","Data":"e6520bb81fcb94747f96d0bde0c4f99cc40f51207c4752c3b45e79ef35a202b3"} Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.651218 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.653525 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7k7sg" Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.852080 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:28 crc kubenswrapper[4772]: I0127 15:09:28.988469 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.034559 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir\") pod \"53a6c4ad-816a-4d22-af98-0587a6a68304\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.034774 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access\") pod \"53a6c4ad-816a-4d22-af98-0587a6a68304\" (UID: \"53a6c4ad-816a-4d22-af98-0587a6a68304\") " Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.035282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "53a6c4ad-816a-4d22-af98-0587a6a68304" (UID: "53a6c4ad-816a-4d22-af98-0587a6a68304"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.043556 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "53a6c4ad-816a-4d22-af98-0587a6a68304" (UID: "53a6c4ad-816a-4d22-af98-0587a6a68304"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.135565 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access\") pod \"545a0fd1-38f1-4cbc-9f37-1870b1673589\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.135754 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir\") pod \"545a0fd1-38f1-4cbc-9f37-1870b1673589\" (UID: \"545a0fd1-38f1-4cbc-9f37-1870b1673589\") " Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.136055 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53a6c4ad-816a-4d22-af98-0587a6a68304-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.136080 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53a6c4ad-816a-4d22-af98-0587a6a68304-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.136144 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "545a0fd1-38f1-4cbc-9f37-1870b1673589" (UID: "545a0fd1-38f1-4cbc-9f37-1870b1673589"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.141514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "545a0fd1-38f1-4cbc-9f37-1870b1673589" (UID: "545a0fd1-38f1-4cbc-9f37-1870b1673589"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.237601 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/545a0fd1-38f1-4cbc-9f37-1870b1673589-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.237676 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/545a0fd1-38f1-4cbc-9f37-1870b1673589-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.536831 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"53a6c4ad-816a-4d22-af98-0587a6a68304","Type":"ContainerDied","Data":"4e5fb5c0a561c986217a03a765b2818013365a6cf31c0cb2523e728110829fe1"} Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.537137 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5fb5c0a561c986217a03a765b2818013365a6cf31c0cb2523e728110829fe1" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.537214 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.566513 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.566860 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"545a0fd1-38f1-4cbc-9f37-1870b1673589","Type":"ContainerDied","Data":"fb9d01e0f3553308e4bfd0ea1eb0d0086a446ca7d54e19f2aad89776bdcb2b2b"} Jan 27 15:09:29 crc kubenswrapper[4772]: I0127 15:09:29.566913 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb9d01e0f3553308e4bfd0ea1eb0d0086a446ca7d54e19f2aad89776bdcb2b2b" Jan 27 15:09:31 crc kubenswrapper[4772]: I0127 15:09:31.630337 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jplbk" Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.557298 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.561620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.591739 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.591808 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.591998 4772 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-vswtw container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 27 15:09:36 crc kubenswrapper[4772]: I0127 15:09:36.604562 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-vswtw" podUID="17bdd07d-f7e5-47f8-b730-724d5cc8e3d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 27 15:09:38 crc kubenswrapper[4772]: I0127 15:09:38.430203 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:38 crc kubenswrapper[4772]: I0127 15:09:38.435838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/371016c8-5a23-427d-aa0a-0faa241d86a7-metrics-certs\") pod \"network-metrics-daemon-ql2vx\" (UID: \"371016c8-5a23-427d-aa0a-0faa241d86a7\") " pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:38 crc kubenswrapper[4772]: I0127 15:09:38.684687 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ql2vx" Jan 27 15:09:42 crc kubenswrapper[4772]: I0127 15:09:42.058540 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:09:42 crc kubenswrapper[4772]: I0127 15:09:42.058618 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:09:44 crc kubenswrapper[4772]: I0127 15:09:44.399067 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:09:46 crc kubenswrapper[4772]: I0127 15:09:46.595013 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-vswtw" Jan 27 15:09:56 crc kubenswrapper[4772]: I0127 15:09:56.981945 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cv2z7" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.303548 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:10:02 crc kubenswrapper[4772]: E0127 15:10:02.304076 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="545a0fd1-38f1-4cbc-9f37-1870b1673589" containerName="pruner" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.304088 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="545a0fd1-38f1-4cbc-9f37-1870b1673589" containerName="pruner" Jan 27 15:10:02 crc 
kubenswrapper[4772]: E0127 15:10:02.304104 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a6c4ad-816a-4d22-af98-0587a6a68304" containerName="pruner" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.304110 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a6c4ad-816a-4d22-af98-0587a6a68304" containerName="pruner" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.304254 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="545a0fd1-38f1-4cbc-9f37-1870b1673589" containerName="pruner" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.304264 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="53a6c4ad-816a-4d22-af98-0587a6a68304" containerName="pruner" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.304719 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.307413 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.308293 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.310769 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.453974 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.454047 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.555117 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.555339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.555533 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.579669 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.626834 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:02 crc kubenswrapper[4772]: I0127 15:10:02.905566 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:10:06 crc kubenswrapper[4772]: E0127 15:10:06.076433 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 15:10:06 crc kubenswrapper[4772]: E0127 15:10:06.077123 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sp9qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:ni
l,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wcldz_openshift-marketplace(987488b4-af4d-4b20-bb26-f433d4d1299a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:06 crc kubenswrapper[4772]: E0127 15:10:06.078285 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wcldz" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.698112 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.698752 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.707822 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.733358 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.733440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.733471 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.834352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.834430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.834459 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.834889 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.834938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:06 crc kubenswrapper[4772]: I0127 15:10:06.858029 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access\") pod \"installer-9-crc\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:07 crc kubenswrapper[4772]: I0127 15:10:07.039704 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.683741 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wcldz" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.784005 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.784395 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvxxl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-75jrg_openshift-marketplace(f637b998-b13b-486d-9042-4cd40a01c833): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.785566 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-75jrg" podUID="f637b998-b13b-486d-9042-4cd40a01c833" Jan 27 15:10:09 crc 
kubenswrapper[4772]: E0127 15:10:09.790273 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.790406 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fnhcz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-k7pfr_openshift-marketplace(d0b33686-8107-4caf-b67f-3c608119a049): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:09 crc kubenswrapper[4772]: E0127 15:10:09.791775 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-k7pfr" podUID="d0b33686-8107-4caf-b67f-3c608119a049" Jan 27 15:10:10 crc kubenswrapper[4772]: E0127 15:10:10.887120 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-75jrg" podUID="f637b998-b13b-486d-9042-4cd40a01c833" Jan 27 15:10:10 crc kubenswrapper[4772]: E0127 15:10:10.887125 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-k7pfr" podUID="d0b33686-8107-4caf-b67f-3c608119a049" Jan 27 15:10:10 crc kubenswrapper[4772]: E0127 15:10:10.949844 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 15:10:10 crc kubenswrapper[4772]: E0127 15:10:10.950024 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-67q8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jwrpk_openshift-marketplace(dd415ccf-2b4a-4797-962f-a464ef96bc22): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:10 crc kubenswrapper[4772]: E0127 15:10:10.951310 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jwrpk" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" Jan 27 15:10:12 crc kubenswrapper[4772]: I0127 15:10:12.058642 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:10:12 crc kubenswrapper[4772]: I0127 15:10:12.059026 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.287137 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jwrpk" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.526840 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.527136 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-87w6q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dfgjh_openshift-marketplace(8cbabfa8-79d8-4b23-b186-b40ba8b3017e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.528367 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dfgjh" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" Jan 27 15:10:13 crc 
kubenswrapper[4772]: E0127 15:10:13.666530 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.667307 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zvrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-95rh9_openshift-marketplace(4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.668532 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-95rh9" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" Jan 27 15:10:13 crc kubenswrapper[4772]: I0127 15:10:13.726387 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:10:13 crc kubenswrapper[4772]: W0127 15:10:13.733596 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb62e8603_1857_4f92_9e4f_b3eab5be12ed.slice/crio-61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb WatchSource:0}: Error finding container 61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb: Status 404 returned error can't find the container with id 61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.763315 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.763491 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7g6kk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xp8ph_openshift-marketplace(ac2b5800-ce98-4847-bfcd-67a97375aa1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.770973 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xp8ph" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" Jan 27 15:10:13 crc 
kubenswrapper[4772]: I0127 15:10:13.775257 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:10:13 crc kubenswrapper[4772]: I0127 15:10:13.786025 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ql2vx"] Jan 27 15:10:13 crc kubenswrapper[4772]: W0127 15:10:13.803086 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod371016c8_5a23_427d_aa0a_0faa241d86a7.slice/crio-39b983878e98594f803a2a5d18787411ce8df5bebcdb7feb44bab66559d18d93 WatchSource:0}: Error finding container 39b983878e98594f803a2a5d18787411ce8df5bebcdb7feb44bab66559d18d93: Status 404 returned error can't find the container with id 39b983878e98594f803a2a5d18787411ce8df5bebcdb7feb44bab66559d18d93 Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.870864 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.871381 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7fhkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9wdps_openshift-marketplace(96e88efd-1f25-4e44-b459-ab773db93656): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.872719 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9wdps" podUID="96e88efd-1f25-4e44-b459-ab773db93656" Jan 27 15:10:13 crc 
kubenswrapper[4772]: I0127 15:10:13.936896 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" event={"ID":"371016c8-5a23-427d-aa0a-0faa241d86a7","Type":"ContainerStarted","Data":"39b983878e98594f803a2a5d18787411ce8df5bebcdb7feb44bab66559d18d93"} Jan 27 15:10:13 crc kubenswrapper[4772]: I0127 15:10:13.938105 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b62e8603-1857-4f92-9e4f-b3eab5be12ed","Type":"ContainerStarted","Data":"61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb"} Jan 27 15:10:13 crc kubenswrapper[4772]: I0127 15:10:13.939673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7bd0c383-7376-4e95-9919-863297cbd807","Type":"ContainerStarted","Data":"27a7557244977e8f16265740d543848969b7bcbc8b87db7409dc58f332f68492"} Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.941874 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dfgjh" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.943096 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-95rh9" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.943152 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xp8ph" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" Jan 27 15:10:13 crc kubenswrapper[4772]: E0127 15:10:13.943319 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9wdps" podUID="96e88efd-1f25-4e44-b459-ab773db93656" Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.951179 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" event={"ID":"371016c8-5a23-427d-aa0a-0faa241d86a7","Type":"ContainerStarted","Data":"4d8d625089c13d4ad5b6cee5beecca8d34e5a17f31b33f041a0a842cab85604c"} Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.951503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ql2vx" event={"ID":"371016c8-5a23-427d-aa0a-0faa241d86a7","Type":"ContainerStarted","Data":"bf3f4ce0b98e4dc2124a6a0ca30c113726e0a65db7e268a10567ef09bb041136"} Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.954709 4772 generic.go:334] "Generic (PLEG): container finished" podID="b62e8603-1857-4f92-9e4f-b3eab5be12ed" containerID="1f7d72c416be4fb3e74c5e402c3e0877f29d27571b78b1c7dc5eefbf979e7d8b" exitCode=0 Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.954865 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b62e8603-1857-4f92-9e4f-b3eab5be12ed","Type":"ContainerDied","Data":"1f7d72c416be4fb3e74c5e402c3e0877f29d27571b78b1c7dc5eefbf979e7d8b"} Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.956954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"7bd0c383-7376-4e95-9919-863297cbd807","Type":"ContainerStarted","Data":"0eeaf2bcc5f54216d999847c8ecf3f795fa45776d35017701579dff468e8db9c"} Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.971470 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ql2vx" podStartSLOduration=178.971447894 podStartE2EDuration="2m58.971447894s" podCreationTimestamp="2026-01-27 15:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:10:14.96496303 +0000 UTC m=+200.945572128" watchObservedRunningTime="2026-01-27 15:10:14.971447894 +0000 UTC m=+200.952056992" Jan 27 15:10:14 crc kubenswrapper[4772]: I0127 15:10:14.992815 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.992796131 podStartE2EDuration="8.992796131s" podCreationTimestamp="2026-01-27 15:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:10:14.992220684 +0000 UTC m=+200.972829782" watchObservedRunningTime="2026-01-27 15:10:14.992796131 +0000 UTC m=+200.973405229" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.201403 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.272598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access\") pod \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.272715 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir\") pod \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\" (UID: \"b62e8603-1857-4f92-9e4f-b3eab5be12ed\") " Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.272990 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b62e8603-1857-4f92-9e4f-b3eab5be12ed" (UID: "b62e8603-1857-4f92-9e4f-b3eab5be12ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.279363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b62e8603-1857-4f92-9e4f-b3eab5be12ed" (UID: "b62e8603-1857-4f92-9e4f-b3eab5be12ed"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.374309 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.374598 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b62e8603-1857-4f92-9e4f-b3eab5be12ed-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.972563 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b62e8603-1857-4f92-9e4f-b3eab5be12ed","Type":"ContainerDied","Data":"61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb"} Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.972610 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61dfd2fab37415adeb86c2e4199d19720dbf229c31a5352f47a4c6a31dab51bb" Jan 27 15:10:16 crc kubenswrapper[4772]: I0127 15:10:16.972670 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:10:25 crc kubenswrapper[4772]: I0127 15:10:25.015647 4772 generic.go:334] "Generic (PLEG): container finished" podID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerID="f473a9dfe0a18cc21db70fde482752286c2a5587ef7f32cf2144d6c58d4053e6" exitCode=0 Jan 27 15:10:25 crc kubenswrapper[4772]: I0127 15:10:25.016243 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerDied","Data":"f473a9dfe0a18cc21db70fde482752286c2a5587ef7f32cf2144d6c58d4053e6"} Jan 27 15:10:27 crc kubenswrapper[4772]: I0127 15:10:27.029409 4772 generic.go:334] "Generic (PLEG): container finished" podID="f637b998-b13b-486d-9042-4cd40a01c833" containerID="2c0b59089ec0a9f3a0a19eaaa3c657c89533289d4c824c8863bf9ca1e0f2856b" exitCode=0 Jan 27 15:10:27 crc kubenswrapper[4772]: I0127 15:10:27.029474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerDied","Data":"2c0b59089ec0a9f3a0a19eaaa3c657c89533289d4c824c8863bf9ca1e0f2856b"} Jan 27 15:10:27 crc kubenswrapper[4772]: I0127 15:10:27.033819 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerStarted","Data":"2524cad9551fd6cd1e12590205a9cfa2493995c09e8d8b75c74bdbd4fbd9dab8"} Jan 27 15:10:27 crc kubenswrapper[4772]: I0127 15:10:27.035697 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerStarted","Data":"35038329828ccd832c938fb7caf96024c35dc820e86dd3e9aadcf5ac8ef257b5"} Jan 27 15:10:27 crc kubenswrapper[4772]: I0127 15:10:27.095033 4772 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-wcldz" podStartSLOduration=3.849487516 podStartE2EDuration="1m3.095012728s" podCreationTimestamp="2026-01-27 15:09:24 +0000 UTC" firstStartedPulling="2026-01-27 15:09:26.435160755 +0000 UTC m=+152.415769863" lastFinishedPulling="2026-01-27 15:10:25.680685977 +0000 UTC m=+211.661295075" observedRunningTime="2026-01-27 15:10:27.092365043 +0000 UTC m=+213.072974161" watchObservedRunningTime="2026-01-27 15:10:27.095012728 +0000 UTC m=+213.075621826" Jan 27 15:10:28 crc kubenswrapper[4772]: I0127 15:10:28.042970 4772 generic.go:334] "Generic (PLEG): container finished" podID="d0b33686-8107-4caf-b67f-3c608119a049" containerID="35038329828ccd832c938fb7caf96024c35dc820e86dd3e9aadcf5ac8ef257b5" exitCode=0 Jan 27 15:10:28 crc kubenswrapper[4772]: I0127 15:10:28.043038 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerDied","Data":"35038329828ccd832c938fb7caf96024c35dc820e86dd3e9aadcf5ac8ef257b5"} Jan 27 15:10:34 crc kubenswrapper[4772]: I0127 15:10:34.478274 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:34 crc kubenswrapper[4772]: I0127 15:10:34.478343 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:35 crc kubenswrapper[4772]: I0127 15:10:35.570816 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:35 crc kubenswrapper[4772]: I0127 15:10:35.605433 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:35 crc kubenswrapper[4772]: I0127 15:10:35.800433 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:10:37 crc kubenswrapper[4772]: I0127 15:10:37.090618 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wcldz" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="registry-server" containerID="cri-o://2524cad9551fd6cd1e12590205a9cfa2493995c09e8d8b75c74bdbd4fbd9dab8" gracePeriod=2 Jan 27 15:10:38 crc kubenswrapper[4772]: I0127 15:10:38.097595 4772 generic.go:334] "Generic (PLEG): container finished" podID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerID="2524cad9551fd6cd1e12590205a9cfa2493995c09e8d8b75c74bdbd4fbd9dab8" exitCode=0 Jan 27 15:10:38 crc kubenswrapper[4772]: I0127 15:10:38.097669 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerDied","Data":"2524cad9551fd6cd1e12590205a9cfa2493995c09e8d8b75c74bdbd4fbd9dab8"} Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.770927 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.852579 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities\") pod \"987488b4-af4d-4b20-bb26-f433d4d1299a\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.852649 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content\") pod \"987488b4-af4d-4b20-bb26-f433d4d1299a\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.852684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9qx\" (UniqueName: \"kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx\") pod \"987488b4-af4d-4b20-bb26-f433d4d1299a\" (UID: \"987488b4-af4d-4b20-bb26-f433d4d1299a\") " Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.853750 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities" (OuterVolumeSpecName: "utilities") pod "987488b4-af4d-4b20-bb26-f433d4d1299a" (UID: "987488b4-af4d-4b20-bb26-f433d4d1299a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.863403 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx" (OuterVolumeSpecName: "kube-api-access-sp9qx") pod "987488b4-af4d-4b20-bb26-f433d4d1299a" (UID: "987488b4-af4d-4b20-bb26-f433d4d1299a"). InnerVolumeSpecName "kube-api-access-sp9qx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.906360 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "987488b4-af4d-4b20-bb26-f433d4d1299a" (UID: "987488b4-af4d-4b20-bb26-f433d4d1299a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.954652 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.954992 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/987488b4-af4d-4b20-bb26-f433d4d1299a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:41 crc kubenswrapper[4772]: I0127 15:10:41.955004 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9qx\" (UniqueName: \"kubernetes.io/projected/987488b4-af4d-4b20-bb26-f433d4d1299a-kube-api-access-sp9qx\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.058593 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.058669 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.058757 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.059412 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.059568 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19" gracePeriod=600 Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.137709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wcldz" event={"ID":"987488b4-af4d-4b20-bb26-f433d4d1299a","Type":"ContainerDied","Data":"cca259e2810aba1b9b6e47d70088b4b43bd08d57432f8fbb61e7ddcd0a7abb94"} Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.137769 4772 scope.go:117] "RemoveContainer" containerID="2524cad9551fd6cd1e12590205a9cfa2493995c09e8d8b75c74bdbd4fbd9dab8" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.137901 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wcldz" Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.168785 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.168849 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wcldz"] Jan 27 15:10:42 crc kubenswrapper[4772]: I0127 15:10:42.670517 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" path="/var/lib/kubelet/pods/987488b4-af4d-4b20-bb26-f433d4d1299a/volumes" Jan 27 15:10:43 crc kubenswrapper[4772]: I0127 15:10:43.143745 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19" exitCode=0 Jan 27 15:10:43 crc kubenswrapper[4772]: I0127 15:10:43.143800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19"} Jan 27 15:10:44 crc kubenswrapper[4772]: I0127 15:10:44.541865 4772 scope.go:117] "RemoveContainer" containerID="f473a9dfe0a18cc21db70fde482752286c2a5587ef7f32cf2144d6c58d4053e6" Jan 27 15:10:46 crc kubenswrapper[4772]: I0127 15:10:46.520193 4772 scope.go:117] "RemoveContainer" containerID="05b1bd7fef5819af5a4449e0a65b00c0b94c405308dc0ea03120ac4091e22a7a" Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.176713 4772 generic.go:334] "Generic (PLEG): container finished" podID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerID="7958617c1c11ed30a9375bc289d465de0afd6e9db9de8e33b0d1ef509c9e2adb" exitCode=0 Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.176800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerDied","Data":"7958617c1c11ed30a9375bc289d465de0afd6e9db9de8e33b0d1ef509c9e2adb"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.185897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerStarted","Data":"0085f838645429f4fe1db48da5f434fde0158d222d9039bcf74e2bc9cea6bf7f"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.187784 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerID="b837145fe89058e7be062b6cdd2bde2c15ab5a2a27d1b5c341bb196fff256ac4" exitCode=0 Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.187838 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerDied","Data":"b837145fe89058e7be062b6cdd2bde2c15ab5a2a27d1b5c341bb196fff256ac4"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.193139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.198409 4772 generic.go:334] "Generic (PLEG): container finished" podID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerID="a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8" exitCode=0 Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.198453 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" 
event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerDied","Data":"a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.204659 4772 generic.go:334] "Generic (PLEG): container finished" podID="96e88efd-1f25-4e44-b459-ab773db93656" containerID="7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6" exitCode=0 Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.204758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerDied","Data":"7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.207600 4772 generic.go:334] "Generic (PLEG): container finished" podID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerID="aff635156e5d675ca2fa44615b92754120942925bd9551591ea562f79911f6af" exitCode=0 Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.207652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerDied","Data":"aff635156e5d675ca2fa44615b92754120942925bd9551591ea562f79911f6af"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.213515 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerStarted","Data":"1c52c9067d3a0dfe5bf38e17654a83a4d2211b850c4ac05e58ed278ee0de4d7e"} Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.221781 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-75jrg" podStartSLOduration=3.211485278 podStartE2EDuration="1m21.221756806s" podCreationTimestamp="2026-01-27 15:09:26 +0000 UTC" firstStartedPulling="2026-01-27 15:09:28.503281858 +0000 UTC 
m=+154.483890956" lastFinishedPulling="2026-01-27 15:10:46.513553386 +0000 UTC m=+232.494162484" observedRunningTime="2026-01-27 15:10:47.220307575 +0000 UTC m=+233.200916693" watchObservedRunningTime="2026-01-27 15:10:47.221756806 +0000 UTC m=+233.202365904" Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.293114 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k7pfr" podStartSLOduration=2.275063455 podStartE2EDuration="1m20.293098254s" podCreationTimestamp="2026-01-27 15:09:27 +0000 UTC" firstStartedPulling="2026-01-27 15:09:28.502325511 +0000 UTC m=+154.482934599" lastFinishedPulling="2026-01-27 15:10:46.5203603 +0000 UTC m=+232.500969398" observedRunningTime="2026-01-27 15:10:47.290427428 +0000 UTC m=+233.271036526" watchObservedRunningTime="2026-01-27 15:10:47.293098254 +0000 UTC m=+233.273707362" Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.392433 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:10:47 crc kubenswrapper[4772]: I0127 15:10:47.392658 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.223098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerStarted","Data":"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e"} Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.226646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerStarted","Data":"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"} Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.229206 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerStarted","Data":"7cdcc72178b424cbd1356a3055a7eccdb609ddd782039c152e1c33a7dc48ebfd"} Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.231251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerStarted","Data":"932559e335376251fa378d1d6f007b100323207571225373c52e6683753426ad"} Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.243280 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95rh9" podStartSLOduration=2.927046524 podStartE2EDuration="1m25.243260121s" podCreationTimestamp="2026-01-27 15:09:23 +0000 UTC" firstStartedPulling="2026-01-27 15:09:25.379733918 +0000 UTC m=+151.360343006" lastFinishedPulling="2026-01-27 15:10:47.695947505 +0000 UTC m=+233.676556603" observedRunningTime="2026-01-27 15:10:48.240891624 +0000 UTC m=+234.221500722" watchObservedRunningTime="2026-01-27 15:10:48.243260121 +0000 UTC m=+234.223869219" Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.263415 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9wdps" podStartSLOduration=2.931746939 podStartE2EDuration="1m25.263396294s" podCreationTimestamp="2026-01-27 15:09:23 +0000 UTC" firstStartedPulling="2026-01-27 15:09:25.382702473 +0000 UTC m=+151.363311571" lastFinishedPulling="2026-01-27 15:10:47.714351828 +0000 UTC m=+233.694960926" observedRunningTime="2026-01-27 15:10:48.260376568 +0000 UTC m=+234.240985696" watchObservedRunningTime="2026-01-27 15:10:48.263396294 +0000 UTC m=+234.244005392" Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.280683 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-dfgjh" podStartSLOduration=2.2119710599999998 podStartE2EDuration="1m22.280664564s" podCreationTimestamp="2026-01-27 15:09:26 +0000 UTC" firstStartedPulling="2026-01-27 15:09:27.495602569 +0000 UTC m=+153.476211667" lastFinishedPulling="2026-01-27 15:10:47.564296073 +0000 UTC m=+233.544905171" observedRunningTime="2026-01-27 15:10:48.278518013 +0000 UTC m=+234.259127131" watchObservedRunningTime="2026-01-27 15:10:48.280664564 +0000 UTC m=+234.261273662" Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.296786 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jwrpk" podStartSLOduration=2.957311848 podStartE2EDuration="1m25.296771232s" podCreationTimestamp="2026-01-27 15:09:23 +0000 UTC" firstStartedPulling="2026-01-27 15:09:25.387805521 +0000 UTC m=+151.368414619" lastFinishedPulling="2026-01-27 15:10:47.727264905 +0000 UTC m=+233.707874003" observedRunningTime="2026-01-27 15:10:48.292642735 +0000 UTC m=+234.273251833" watchObservedRunningTime="2026-01-27 15:10:48.296771232 +0000 UTC m=+234.277380330" Jan 27 15:10:48 crc kubenswrapper[4772]: I0127 15:10:48.439862 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k7pfr" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="registry-server" probeResult="failure" output=< Jan 27 15:10:48 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 15:10:48 crc kubenswrapper[4772]: > Jan 27 15:10:49 crc kubenswrapper[4772]: I0127 15:10:49.240073 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerStarted","Data":"f36658ad464a5804d99326d4821347a5bf28ef6ab12d7aacdef462094deb1db8"} Jan 27 15:10:49 crc kubenswrapper[4772]: I0127 15:10:49.262095 4772 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-marketplace-xp8ph" podStartSLOduration=3.125388608 podStartE2EDuration="1m24.262076841s" podCreationTimestamp="2026-01-27 15:09:25 +0000 UTC" firstStartedPulling="2026-01-27 15:09:27.491286984 +0000 UTC m=+153.471896082" lastFinishedPulling="2026-01-27 15:10:48.627975217 +0000 UTC m=+234.608584315" observedRunningTime="2026-01-27 15:10:49.257226513 +0000 UTC m=+235.237835621" watchObservedRunningTime="2026-01-27 15:10:49.262076841 +0000 UTC m=+235.242685949" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.578105 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9wdps"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.578841 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9wdps" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="registry-server" containerID="cri-o://321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.589460 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-95rh9"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.589733 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-95rh9" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="registry-server" containerID="cri-o://891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.599775 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jwrpk"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.599980 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jwrpk" 
podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="registry-server" containerID="cri-o://7cdcc72178b424cbd1356a3055a7eccdb609ddd782039c152e1c33a7dc48ebfd" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.606683 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4lj2h"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.606895 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" containerID="cri-o://8632589c7dbe4bb64d8d2a9e0983c8088c1ff445e316f1dd7c4e04e72fa148df" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.617897 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dfgjh"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.618141 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dfgjh" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="registry-server" containerID="cri-o://932559e335376251fa378d1d6f007b100323207571225373c52e6683753426ad" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.627578 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xp8ph"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.627840 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xp8ph" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="registry-server" containerID="cri-o://f36658ad464a5804d99326d4821347a5bf28ef6ab12d7aacdef462094deb1db8" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.638534 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-2glnd"] Jan 27 15:10:50 crc kubenswrapper[4772]: E0127 15:10:50.638828 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="extract-utilities" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.638858 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="extract-utilities" Jan 27 15:10:50 crc kubenswrapper[4772]: E0127 15:10:50.638876 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="registry-server" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.638883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="registry-server" Jan 27 15:10:50 crc kubenswrapper[4772]: E0127 15:10:50.638897 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="extract-content" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.638909 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="extract-content" Jan 27 15:10:50 crc kubenswrapper[4772]: E0127 15:10:50.638926 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b62e8603-1857-4f92-9e4f-b3eab5be12ed" containerName="pruner" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.638933 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b62e8603-1857-4f92-9e4f-b3eab5be12ed" containerName="pruner" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.639053 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="987488b4-af4d-4b20-bb26-f433d4d1299a" containerName="registry-server" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.639066 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b62e8603-1857-4f92-9e4f-b3eab5be12ed" containerName="pruner" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.639500 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.644281 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75jrg"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.644564 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-75jrg" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="registry-server" containerID="cri-o://0085f838645429f4fe1db48da5f434fde0158d222d9039bcf74e2bc9cea6bf7f" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.657703 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2glnd"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.661135 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k7pfr"] Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.666523 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k7pfr" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="registry-server" containerID="cri-o://1c52c9067d3a0dfe5bf38e17654a83a4d2211b850c4ac05e58ed278ee0de4d7e" gracePeriod=30 Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.671105 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqkv6\" (UniqueName: \"kubernetes.io/projected/d8591d45-25d0-47ea-a856-9cd5334e4a8c-kube-api-access-zqkv6\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 
15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.671160 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.671263 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.786360 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.787434 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqkv6\" (UniqueName: \"kubernetes.io/projected/d8591d45-25d0-47ea-a856-9cd5334e4a8c-kube-api-access-zqkv6\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.787507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.789129 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.799836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d8591d45-25d0-47ea-a856-9cd5334e4a8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.817078 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqkv6\" (UniqueName: \"kubernetes.io/projected/d8591d45-25d0-47ea-a856-9cd5334e4a8c-kube-api-access-zqkv6\") pod \"marketplace-operator-79b997595-2glnd\" (UID: \"d8591d45-25d0-47ea-a856-9cd5334e4a8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:50 crc kubenswrapper[4772]: I0127 15:10:50.962573 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.203543 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.204670 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.260661 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fgw98"] Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.269938 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7pfr_d0b33686-8107-4caf-b67f-3c608119a049/registry-server/0.log" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.280412 4772 generic.go:334] "Generic (PLEG): container finished" podID="d0b33686-8107-4caf-b67f-3c608119a049" containerID="1c52c9067d3a0dfe5bf38e17654a83a4d2211b850c4ac05e58ed278ee0de4d7e" exitCode=1 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.280483 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerDied","Data":"1c52c9067d3a0dfe5bf38e17654a83a4d2211b850c4ac05e58ed278ee0de4d7e"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.309585 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwrpk" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.310058 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75jrg_f637b998-b13b-486d-9042-4cd40a01c833/registry-server/0.log" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.329392 4772 generic.go:334] "Generic (PLEG): container finished" podID="f637b998-b13b-486d-9042-4cd40a01c833" containerID="0085f838645429f4fe1db48da5f434fde0158d222d9039bcf74e2bc9cea6bf7f" exitCode=1 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.329518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerDied","Data":"0085f838645429f4fe1db48da5f434fde0158d222d9039bcf74e2bc9cea6bf7f"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.349588 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerID="f36658ad464a5804d99326d4821347a5bf28ef6ab12d7aacdef462094deb1db8" exitCode=0 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.349853 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerDied","Data":"f36658ad464a5804d99326d4821347a5bf28ef6ab12d7aacdef462094deb1db8"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.353093 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75jrg_f637b998-b13b-486d-9042-4cd40a01c833/registry-server/0.log" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356122 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356629 4772 generic.go:334] "Generic (PLEG): container finished" podID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerID="891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e" exitCode=0 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356766 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95rh9" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356805 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerDied","Data":"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rh9" event={"ID":"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7","Type":"ContainerDied","Data":"ae198d4139eb016b136b591cf513d4e1d588e78f4cc1966851a08fad44048adb"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.356887 4772 scope.go:117] "RemoveContainer" containerID="891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.370352 4772 generic.go:334] "Generic (PLEG): container finished" podID="96e88efd-1f25-4e44-b459-ab773db93656" containerID="321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d" exitCode=0 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.370410 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerDied","Data":"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.370437 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9wdps" event={"ID":"96e88efd-1f25-4e44-b459-ab773db93656","Type":"ContainerDied","Data":"806bc56d016ac75a91f0a1effbd4a1494b65e83f8a957a34cd38d253ad927cc3"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.370513 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9wdps" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.370928 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7pfr_d0b33686-8107-4caf-b67f-3c608119a049/registry-server/0.log" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.371467 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.376576 4772 generic.go:334] "Generic (PLEG): container finished" podID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerID="7cdcc72178b424cbd1356a3055a7eccdb609ddd782039c152e1c33a7dc48ebfd" exitCode=0 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.376638 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerDied","Data":"7cdcc72178b424cbd1356a3055a7eccdb609ddd782039c152e1c33a7dc48ebfd"} Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.376710 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jwrpk"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.377457 4772 scope.go:117] "RemoveContainer" containerID="a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.388117 4772 generic.go:334] "Generic (PLEG): container finished" podID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerID="932559e335376251fa378d1d6f007b100323207571225373c52e6683753426ad" exitCode=0
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.388180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerDied","Data":"932559e335376251fa378d1d6f007b100323207571225373c52e6683753426ad"}
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.389736 4772 generic.go:334] "Generic (PLEG): container finished" podID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerID="8632589c7dbe4bb64d8d2a9e0983c8088c1ff445e316f1dd7c4e04e72fa148df" exitCode=0
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.389762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" event={"ID":"c8ebf890-c3b0-468e-bf7d-0ec590df084b","Type":"ContainerDied","Data":"8632589c7dbe4bb64d8d2a9e0983c8088c1ff445e316f1dd7c4e04e72fa148df"}
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.402688 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities\") pod \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.402782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content\") pod \"96e88efd-1f25-4e44-b459-ab773db93656\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.402810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvrk\" (UniqueName: \"kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk\") pod \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.402843 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fhkv\" (UniqueName: \"kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv\") pod \"96e88efd-1f25-4e44-b459-ab773db93656\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.402880 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content\") pod \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\" (UID: \"4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.403021 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities\") pod \"96e88efd-1f25-4e44-b459-ab773db93656\" (UID: \"96e88efd-1f25-4e44-b459-ab773db93656\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.404390 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities" (OuterVolumeSpecName: "utilities") pod "96e88efd-1f25-4e44-b459-ab773db93656" (UID: "96e88efd-1f25-4e44-b459-ab773db93656"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.405626 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities" (OuterVolumeSpecName: "utilities") pod "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" (UID: "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.421742 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv" (OuterVolumeSpecName: "kube-api-access-7fhkv") pod "96e88efd-1f25-4e44-b459-ab773db93656" (UID: "96e88efd-1f25-4e44-b459-ab773db93656"). InnerVolumeSpecName "kube-api-access-7fhkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.422786 4772 scope.go:117] "RemoveContainer" containerID="16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.453296 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk" (OuterVolumeSpecName: "kube-api-access-4zvrk") pod "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" (UID: "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7"). InnerVolumeSpecName "kube-api-access-4zvrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.473637 4772 scope.go:117] "RemoveContainer" containerID="891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.475281 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e\": container with ID starting with 891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e not found: ID does not exist" containerID="891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.475317 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e"} err="failed to get container status \"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e\": rpc error: code = NotFound desc = could not find container \"891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e\": container with ID starting with 891a196fc4fbfd6fd254de7d8acfa840ad89166efe2aae3755959d4061df327e not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.475343 4772 scope.go:117] "RemoveContainer" containerID="a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.477964 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8\": container with ID starting with a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8 not found: ID does not exist" containerID="a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.477999 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8"} err="failed to get container status \"a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8\": rpc error: code = NotFound desc = could not find container \"a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8\": container with ID starting with a2e75315afb920925e12db40d3166aec37363a00b9620db15e2cf976b7c362c8 not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.478021 4772 scope.go:117] "RemoveContainer" containerID="16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.478292 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.482078 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd\": container with ID starting with 16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd not found: ID does not exist" containerID="16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.482124 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd"} err="failed to get container status \"16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd\": rpc error: code = NotFound desc = could not find container \"16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd\": container with ID starting with 16478a3399ae464f44f2335d6adb9e9f0cad87f3f55187da1906c5dcf87534dd not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.482155 4772 scope.go:117] "RemoveContainer" containerID="321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.484053 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" (UID: "4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.487043 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfgjh"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509023 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content\") pod \"d0b33686-8107-4caf-b67f-3c608119a049\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509082 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities\") pod \"d0b33686-8107-4caf-b67f-3c608119a049\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509191 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67q8x\" (UniqueName: \"kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x\") pod \"dd415ccf-2b4a-4797-962f-a464ef96bc22\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509252 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhcz\" (UniqueName: \"kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz\") pod \"d0b33686-8107-4caf-b67f-3c608119a049\" (UID: \"d0b33686-8107-4caf-b67f-3c608119a049\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509277 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities\") pod \"dd415ccf-2b4a-4797-962f-a464ef96bc22\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509318 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content\") pod \"f637b998-b13b-486d-9042-4cd40a01c833\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509348 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content\") pod \"dd415ccf-2b4a-4797-962f-a464ef96bc22\" (UID: \"dd415ccf-2b4a-4797-962f-a464ef96bc22\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509380 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvxxl\" (UniqueName: \"kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl\") pod \"f637b998-b13b-486d-9042-4cd40a01c833\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509418 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities\") pod \"f637b998-b13b-486d-9042-4cd40a01c833\" (UID: \"f637b998-b13b-486d-9042-4cd40a01c833\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509660 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509677 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509690 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvrk\" (UniqueName: \"kubernetes.io/projected/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-kube-api-access-4zvrk\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509705 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fhkv\" (UniqueName: \"kubernetes.io/projected/96e88efd-1f25-4e44-b459-ab773db93656-kube-api-access-7fhkv\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.509718 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.523531 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities" (OuterVolumeSpecName: "utilities") pod "f637b998-b13b-486d-9042-4cd40a01c833" (UID: "f637b998-b13b-486d-9042-4cd40a01c833"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.527825 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x" (OuterVolumeSpecName: "kube-api-access-67q8x") pod "dd415ccf-2b4a-4797-962f-a464ef96bc22" (UID: "dd415ccf-2b4a-4797-962f-a464ef96bc22"). InnerVolumeSpecName "kube-api-access-67q8x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.532185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl" (OuterVolumeSpecName: "kube-api-access-vvxxl") pod "f637b998-b13b-486d-9042-4cd40a01c833" (UID: "f637b998-b13b-486d-9042-4cd40a01c833"). InnerVolumeSpecName "kube-api-access-vvxxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.532298 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz" (OuterVolumeSpecName: "kube-api-access-fnhcz") pod "d0b33686-8107-4caf-b67f-3c608119a049" (UID: "d0b33686-8107-4caf-b67f-3c608119a049"). InnerVolumeSpecName "kube-api-access-fnhcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.551328 4772 scope.go:117] "RemoveContainer" containerID="7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.581203 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities" (OuterVolumeSpecName: "utilities") pod "dd415ccf-2b4a-4797-962f-a464ef96bc22" (UID: "dd415ccf-2b4a-4797-962f-a464ef96bc22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.592037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96e88efd-1f25-4e44-b459-ab773db93656" (UID: "96e88efd-1f25-4e44-b459-ab773db93656"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.604262 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities" (OuterVolumeSpecName: "utilities") pod "d0b33686-8107-4caf-b67f-3c608119a049" (UID: "d0b33686-8107-4caf-b67f-3c608119a049"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617497 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content\") pod \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities\") pod \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617630 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvb9h\" (UniqueName: \"kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h\") pod \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics\") pod \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87w6q\" (UniqueName: \"kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q\") pod \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\" (UID: \"8cbabfa8-79d8-4b23-b186-b40ba8b3017e\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.617810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca\") pod \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\" (UID: \"c8ebf890-c3b0-468e-bf7d-0ec590df084b\") "
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618031 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnhcz\" (UniqueName: \"kubernetes.io/projected/d0b33686-8107-4caf-b67f-3c608119a049-kube-api-access-fnhcz\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618046 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618058 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvxxl\" (UniqueName: \"kubernetes.io/projected/f637b998-b13b-486d-9042-4cd40a01c833-kube-api-access-vvxxl\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618070 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618083 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e88efd-1f25-4e44-b459-ab773db93656-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618094 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.618105 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67q8x\" (UniqueName: \"kubernetes.io/projected/dd415ccf-2b4a-4797-962f-a464ef96bc22-kube-api-access-67q8x\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.619334 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c8ebf890-c3b0-468e-bf7d-0ec590df084b" (UID: "c8ebf890-c3b0-468e-bf7d-0ec590df084b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.622029 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities" (OuterVolumeSpecName: "utilities") pod "8cbabfa8-79d8-4b23-b186-b40ba8b3017e" (UID: "8cbabfa8-79d8-4b23-b186-b40ba8b3017e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.624670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h" (OuterVolumeSpecName: "kube-api-access-vvb9h") pod "c8ebf890-c3b0-468e-bf7d-0ec590df084b" (UID: "c8ebf890-c3b0-468e-bf7d-0ec590df084b"). InnerVolumeSpecName "kube-api-access-vvb9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.625998 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q" (OuterVolumeSpecName: "kube-api-access-87w6q") pod "8cbabfa8-79d8-4b23-b186-b40ba8b3017e" (UID: "8cbabfa8-79d8-4b23-b186-b40ba8b3017e"). InnerVolumeSpecName "kube-api-access-87w6q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.627393 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2glnd"]
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.630199 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd415ccf-2b4a-4797-962f-a464ef96bc22" (UID: "dd415ccf-2b4a-4797-962f-a464ef96bc22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.632389 4772 scope.go:117] "RemoveContainer" containerID="19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.638285 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c8ebf890-c3b0-468e-bf7d-0ec590df084b" (UID: "c8ebf890-c3b0-468e-bf7d-0ec590df084b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.660617 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xp8ph"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.673494 4772 scope.go:117] "RemoveContainer" containerID="321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.674241 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d\": container with ID starting with 321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d not found: ID does not exist" containerID="321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.674283 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d"} err="failed to get container status \"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d\": rpc error: code = NotFound desc = could not find container \"321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d\": container with ID starting with 321d45f45e6bfbd7d001b6cc4114cba4c366b13901f8014ac260d8f8a05b7c8d not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.674313 4772 scope.go:117] "RemoveContainer" containerID="7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.680306 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6\": container with ID starting with 7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6 not found: ID does not exist" containerID="7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.680347 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6"} err="failed to get container status \"7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6\": rpc error: code = NotFound desc = could not find container \"7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6\": container with ID starting with 7127f22fb066327a7e4cfc38f52653579bcf1558a1847106ba8e7a3945fa02a6 not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.680374 4772 scope.go:117] "RemoveContainer" containerID="19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.685473 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81\": container with ID starting with 19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81 not found: ID does not exist" containerID="19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.685629 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81"} err="failed to get container status \"19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81\": rpc error: code = NotFound desc = could not find container \"19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81\": container with ID starting with 19447f4e7cf721e9d74fba0b607574c2c8311dd4da0f4f4a1ba431d8c2a3ca81 not found: ID does not exist"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.685730 4772 scope.go:117] "RemoveContainer" containerID="7cdcc72178b424cbd1356a3055a7eccdb609ddd782039c152e1c33a7dc48ebfd"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719036 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd415ccf-2b4a-4797-962f-a464ef96bc22-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719096 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87w6q\" (UniqueName: \"kubernetes.io/projected/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-kube-api-access-87w6q\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719109 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719117 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719128 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvb9h\" (UniqueName: \"kubernetes.io/projected/c8ebf890-c3b0-468e-bf7d-0ec590df084b-kube-api-access-vvb9h\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.719152 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c8ebf890-c3b0-468e-bf7d-0ec590df084b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.722737 4772 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.722900 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723303 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723332 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723355 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723364 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723375 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723383 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723390 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723399 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723411 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723420 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723433 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723443 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723455 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723463 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723473 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723481 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723494 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723501 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723508 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723515 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723525 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723532 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723541 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723548 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723558 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723565 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723577 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723585 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723595 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723602 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="extract-utilities"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723615 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723623 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723633 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723642 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="registry-server"
Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723654 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="extract-content"
Jan 27 15:10:51 crc kubenswrapper[4772]:
I0127 15:10:51.723663 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="extract-content" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723675 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723683 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723693 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723700 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723711 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723718 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723731 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="extract-content" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723739 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="extract-content" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723753 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 
15:10:51.723764 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723775 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723784 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723795 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723807 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723817 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723826 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723837 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723845 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723853 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" Jan 27 15:10:51 crc 
kubenswrapper[4772]: I0127 15:10:51.723864 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723875 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: E0127 15:10:51.723893 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.723901 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="extract-utilities" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.726672 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08" gracePeriod=15 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.726895 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d" gracePeriod=15 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727064 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a" gracePeriod=15 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727143 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59" gracePeriod=15 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.726850 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727244 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727255 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727266 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" containerName="marketplace-operator" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727277 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e88efd-1f25-4e44-b459-ab773db93656" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727287 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727295 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 15:10:51 crc 
kubenswrapper[4772]: I0127 15:10:51.727301 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f637b998-b13b-486d-9042-4cd40a01c833" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727308 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727317 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727325 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727333 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727341 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b33686-8107-4caf-b67f-3c608119a049" containerName="registry-server" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727352 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727526 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.727929 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b" 
gracePeriod=15 Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.728827 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cbabfa8-79d8-4b23-b186-b40ba8b3017e" (UID: "8cbabfa8-79d8-4b23-b186-b40ba8b3017e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.730808 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.734010 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.746123 4772 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.820519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content\") pod \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.820975 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.821079 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities\") pod \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.821253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g6kk\" (UniqueName: \"kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk\") pod \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\" (UID: \"ac2b5800-ce98-4847-bfcd-67a97375aa1b\") " Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.821544 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cbabfa8-79d8-4b23-b186-b40ba8b3017e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.822567 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities" (OuterVolumeSpecName: "utilities") pod "ac2b5800-ce98-4847-bfcd-67a97375aa1b" (UID: "ac2b5800-ce98-4847-bfcd-67a97375aa1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.828299 4772 scope.go:117] "RemoveContainer" containerID="aff635156e5d675ca2fa44615b92754120942925bd9551591ea562f79911f6af" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.845903 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk" (OuterVolumeSpecName: "kube-api-access-7g6kk") pod "ac2b5800-ce98-4847-bfcd-67a97375aa1b" (UID: "ac2b5800-ce98-4847-bfcd-67a97375aa1b"). InnerVolumeSpecName "kube-api-access-7g6kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.853509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0b33686-8107-4caf-b67f-3c608119a049" (UID: "d0b33686-8107-4caf-b67f-3c608119a049"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.862204 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f637b998-b13b-486d-9042-4cd40a01c833" (UID: "f637b998-b13b-486d-9042-4cd40a01c833"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.882471 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac2b5800-ce98-4847-bfcd-67a97375aa1b" (UID: "ac2b5800-ce98-4847-bfcd-67a97375aa1b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.904918 4772 scope.go:117] "RemoveContainer" containerID="759437d05ef598aec5d4669f7ffea07fc52730444984e390ed6235fe2f84e271" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.923947 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.923989 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924016 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924135 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924179 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924208 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924243 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924350 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924364 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac2b5800-ce98-4847-bfcd-67a97375aa1b-utilities\") on node 
\"crc\" DevicePath \"\"" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924376 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0b33686-8107-4caf-b67f-3c608119a049-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924390 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g6kk\" (UniqueName: \"kubernetes.io/projected/ac2b5800-ce98-4847-bfcd-67a97375aa1b-kube-api-access-7g6kk\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:51 crc kubenswrapper[4772]: I0127 15:10:51.924403 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f637b998-b13b-486d-9042-4cd40a01c833-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.013822 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.014193 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.014606 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 
38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.015007 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.015376 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.015641 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: E0127 15:10:52.016871 4772 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.025900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 
15:10:52.025983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.025997 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026030 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026068 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026085 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026111 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026216 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026246 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026281 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026313 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026312 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.026390 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: E0127 15:10:52.066526 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": 
dial tcp 38.129.56.134:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-2glnd.188e9f1d5cf52191 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-2glnd,UID:d8591d45-25d0-47ea-a856-9cd5334e4a8c,APIVersion:v1,ResourceVersion:29454,FieldPath:spec.containers{marketplace-operator},},Reason:Created,Message:Created container marketplace-operator,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC m=+238.046229467,LastTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC m=+238.046229467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.318123 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:52 crc kubenswrapper[4772]: W0127 15:10:52.332720 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-c8e5981d023622831660010c517a0b33cd1e36ddfcc17aac8b21646f3366db9e WatchSource:0}: Error finding container c8e5981d023622831660010c517a0b33cd1e36ddfcc17aac8b21646f3366db9e: Status 404 returned error can't find the container with id c8e5981d023622831660010c517a0b33cd1e36ddfcc17aac8b21646f3366db9e Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.409440 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.411852 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.413138 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08" exitCode=0 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.413188 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a" exitCode=0 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.413202 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d" exitCode=0 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.413212 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59" exitCode=2 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.413225 4772 scope.go:117] "RemoveContainer" containerID="a6f8b4fa9f839939910224ff95f7788a5cdb3f9ff233a0621e06efdad5c3fa67" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.415742 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-k7pfr_d0b33686-8107-4caf-b67f-3c608119a049/registry-server/0.log" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.416703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k7pfr" event={"ID":"d0b33686-8107-4caf-b67f-3c608119a049","Type":"ContainerDied","Data":"73b204aeb508c9b52f9299efc082b371c9a1b4cb36fb76b5cd3cb0d31f443821"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.416755 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k7pfr" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.417428 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.417735 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.418129 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.418361 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.419982 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jwrpk" event={"ID":"dd415ccf-2b4a-4797-962f-a464ef96bc22","Type":"ContainerDied","Data":"625e8aba305d47657203f1b68ee02b451e267b67921c5b887eaa601500155d6c"} Jan 27 15:10:52 
crc kubenswrapper[4772]: I0127 15:10:52.423076 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dfgjh" event={"ID":"8cbabfa8-79d8-4b23-b186-b40ba8b3017e","Type":"ContainerDied","Data":"a43ec03d3d30e58e4b1f455c9ffe4cf357362515c513668efc22ad8bedf315c3"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.423153 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dfgjh" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.423990 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.424397 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.424751 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.425385 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.425711 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.427014 4772 generic.go:334] "Generic (PLEG): container finished" podID="7bd0c383-7376-4e95-9919-863297cbd807" containerID="0eeaf2bcc5f54216d999847c8ecf3f795fa45776d35017701579dff468e8db9c" exitCode=0 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.427063 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7bd0c383-7376-4e95-9919-863297cbd807","Type":"ContainerDied","Data":"0eeaf2bcc5f54216d999847c8ecf3f795fa45776d35017701579dff468e8db9c"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.428349 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.428625 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.428900 4772 
status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.429144 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.429547 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.430027 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.431805 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/0.log" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.431875 4772 generic.go:334] "Generic (PLEG): container finished" podID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" containerID="473257dde7457f1aee0bf59bde3a2b0a19d1fd13941a08f767066393bccd2ab9" 
exitCode=1 Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.431979 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" event={"ID":"d8591d45-25d0-47ea-a856-9cd5334e4a8c","Type":"ContainerDied","Data":"473257dde7457f1aee0bf59bde3a2b0a19d1fd13941a08f767066393bccd2ab9"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.432014 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" event={"ID":"d8591d45-25d0-47ea-a856-9cd5334e4a8c","Type":"ContainerStarted","Data":"6d8062e53fa2b66b9595b12b076dd98687a7682c2e18c22690a2da9d229c8552"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.432068 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.432150 4772 scope.go:117] "RemoveContainer" containerID="473257dde7457f1aee0bf59bde3a2b0a19d1fd13941a08f767066393bccd2ab9" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.432377 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.432806 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.433211 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.433857 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.434515 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.435018 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.435204 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" event={"ID":"c8ebf890-c3b0-468e-bf7d-0ec590df084b","Type":"ContainerDied","Data":"267d22366b6b80c120159c7b29d573289ed71a1b2d51c437b57f97f84c344fdc"} Jan 27 15:10:52 crc 
kubenswrapper[4772]: I0127 15:10:52.435273 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.435294 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.435568 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.436000 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.436378 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.436576 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.436762 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.437083 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.437340 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.437533 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.437812 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" 
pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.438576 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.438775 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75jrg_f637b998-b13b-486d-9042-4cd40a01c833/registry-server/0.log" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.438806 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439020 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439199 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439413 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439528 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75jrg" event={"ID":"f637b998-b13b-486d-9042-4cd40a01c833","Type":"ContainerDied","Data":"e6520bb81fcb94747f96d0bde0c4f99cc40f51207c4752c3b45e79ef35a202b3"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439600 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75jrg" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439686 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.439967 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.440229 4772 status_manager.go:851] "Failed to get 
status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.440442 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.440644 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.440830 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.441008 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.441291 4772 status_manager.go:851] "Failed to get 
status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.441494 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.441597 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c8e5981d023622831660010c517a0b33cd1e36ddfcc17aac8b21646f3366db9e"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.441699 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.442234 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.442435 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.443771 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.444055 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.445074 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.445748 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xp8ph" event={"ID":"ac2b5800-ce98-4847-bfcd-67a97375aa1b","Type":"ContainerDied","Data":"d8c8cf461f8b99ac3badb881b8d4938b4c0d57c4110c08a8b732cf718594c112"} Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.445834 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xp8ph" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.446051 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.454405 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.455373 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.455577 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.455806 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.456204 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.456388 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.456598 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.456784 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.456964 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" 
pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.457130 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.472619 4772 scope.go:117] "RemoveContainer" containerID="1c52c9067d3a0dfe5bf38e17654a83a4d2211b850c4ac05e58ed278ee0de4d7e" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.488388 4772 scope.go:117] "RemoveContainer" containerID="35038329828ccd832c938fb7caf96024c35dc820e86dd3e9aadcf5ac8ef257b5" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503279 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503445 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503589 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503727 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503856 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.503985 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504132 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504309 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" 
pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504447 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504579 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504738 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.504926 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.505253 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.506160 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.506358 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.506550 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.506722 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.506893 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.507084 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.507349 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.514308 4772 scope.go:117] "RemoveContainer" containerID="142c7ce0c1e97146a8a91a5cb46adbe2ee5537497547ac756fc38abfd7afe96c" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.532057 4772 scope.go:117] "RemoveContainer" containerID="932559e335376251fa378d1d6f007b100323207571225373c52e6683753426ad" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.553630 4772 scope.go:117] "RemoveContainer" containerID="7958617c1c11ed30a9375bc289d465de0afd6e9db9de8e33b0d1ef509c9e2adb" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.575352 4772 scope.go:117] "RemoveContainer" containerID="e90d644c15f4a502d49563577bfa11dc77829d65c3be871a6762542c2dc18bcb" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.589508 4772 scope.go:117] "RemoveContainer" containerID="8632589c7dbe4bb64d8d2a9e0983c8088c1ff445e316f1dd7c4e04e72fa148df" Jan 27 15:10:52 crc kubenswrapper[4772]: 
I0127 15:10:52.604495 4772 scope.go:117] "RemoveContainer" containerID="0085f838645429f4fe1db48da5f434fde0158d222d9039bcf74e2bc9cea6bf7f" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.623023 4772 scope.go:117] "RemoveContainer" containerID="2c0b59089ec0a9f3a0a19eaaa3c657c89533289d4c824c8863bf9ca1e0f2856b" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.640306 4772 scope.go:117] "RemoveContainer" containerID="980163ea21706e95ea0803e7c47d4cf7427d498f1351e03fac04539f250bddc6" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.657783 4772 scope.go:117] "RemoveContainer" containerID="f36658ad464a5804d99326d4821347a5bf28ef6ab12d7aacdef462094deb1db8" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.671827 4772 scope.go:117] "RemoveContainer" containerID="b837145fe89058e7be062b6cdd2bde2c15ab5a2a27d1b5c341bb196fff256ac4" Jan 27 15:10:52 crc kubenswrapper[4772]: I0127 15:10:52.715069 4772 scope.go:117] "RemoveContainer" containerID="2b642630dd7f7b63f30ba841d8958c4ae79e62858aee5dff568bed444b47b036" Jan 27 15:10:52 crc kubenswrapper[4772]: E0127 15:10:52.760498 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.134:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-2glnd.188e9f1d5cf52191 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-2glnd,UID:d8591d45-25d0-47ea-a856-9cd5334e4a8c,APIVersion:v1,ResourceVersion:29454,FieldPath:spec.containers{marketplace-operator},},Reason:Created,Message:Created container marketplace-operator,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC m=+238.046229467,LastTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC 
m=+238.046229467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.476918 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4"} Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.477440 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: E0127 15:10:53.477565 4772 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.477721 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.478249 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: 
connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.478597 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.478875 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.479257 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.479482 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.479721 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 
38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.480050 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.480361 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.481128 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/1.log" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.481587 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/0.log" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.481631 4772 generic.go:334] "Generic (PLEG): container finished" podID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" exitCode=1 Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.481689 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" event={"ID":"d8591d45-25d0-47ea-a856-9cd5334e4a8c","Type":"ContainerDied","Data":"c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957"} Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.481792 4772 
scope.go:117] "RemoveContainer" containerID="473257dde7457f1aee0bf59bde3a2b0a19d1fd13941a08f767066393bccd2ab9" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.482078 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.482231 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: E0127 15:10:53.482287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.482542 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.482820 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc 
kubenswrapper[4772]: I0127 15:10:53.483039 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.483308 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.483532 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.483760 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.484016 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection 
refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.484331 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.484621 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.486299 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.703211 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.704270 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.704524 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.704836 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.705060 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.705312 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.705801 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.706393 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.706680 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.706908 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.707192 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.849243 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access\") pod \"7bd0c383-7376-4e95-9919-863297cbd807\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.849291 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir\") pod \"7bd0c383-7376-4e95-9919-863297cbd807\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.849337 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock\") pod \"7bd0c383-7376-4e95-9919-863297cbd807\" (UID: \"7bd0c383-7376-4e95-9919-863297cbd807\") " Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.849605 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock" (OuterVolumeSpecName: "var-lock") pod "7bd0c383-7376-4e95-9919-863297cbd807" (UID: "7bd0c383-7376-4e95-9919-863297cbd807"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.849641 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7bd0c383-7376-4e95-9919-863297cbd807" (UID: "7bd0c383-7376-4e95-9919-863297cbd807"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.858616 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7bd0c383-7376-4e95-9919-863297cbd807" (UID: "7bd0c383-7376-4e95-9919-863297cbd807"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.950373 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7bd0c383-7376-4e95-9919-863297cbd807-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.950646 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:53 crc kubenswrapper[4772]: I0127 15:10:53.950658 4772 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7bd0c383-7376-4e95-9919-863297cbd807-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.102394 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.103130 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.103740 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.104177 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.104451 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.104708 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.105202 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.105412 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.105575 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.105810 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.106063 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.106269 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.106498 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253685 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253838 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253878 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253986 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.253992 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.254328 4772 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.254348 4772 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.254358 4772 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.495962 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7bd0c383-7376-4e95-9919-863297cbd807","Type":"ContainerDied","Data":"27a7557244977e8f16265740d543848969b7bcbc8b87db7409dc58f332f68492"} Jan 27 15:10:54 crc 
kubenswrapper[4772]: I0127 15:10:54.496009 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27a7557244977e8f16265740d543848969b7bcbc8b87db7409dc58f332f68492" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.495988 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.503619 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/1.log" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.504551 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.505094 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.505101 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.505502 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.506023 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.506588 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.506879 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.506948 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.507417 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 
15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.507737 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508027 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508398 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508449 4772 scope.go:117] "RemoveContainer" containerID="d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508490 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508417 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b" exitCode=0 Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.508789 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.509093 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.509255 4772 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.509580 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.509868 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.510357 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.510656 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.510953 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.511448 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.511873 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.512404 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.512761 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.513204 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.513667 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.520550 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.520974 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.521333 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.521807 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.522179 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.522493 4772 status_manager.go:851] "Failed to get status for 
pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.522923 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.523384 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.523557 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.523713 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.523991 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.532047 4772 scope.go:117] "RemoveContainer" containerID="ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.547028 4772 scope.go:117] "RemoveContainer" containerID="cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.558741 4772 scope.go:117] "RemoveContainer" containerID="013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.571971 4772 scope.go:117] "RemoveContainer" containerID="f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.585340 4772 scope.go:117] "RemoveContainer" containerID="6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.603075 4772 scope.go:117] "RemoveContainer" containerID="d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.606071 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\": container with ID starting with d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08 not found: ID does not exist" containerID="d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606104 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08"} err="failed to get container status \"d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\": rpc error: code = NotFound desc = could not find container \"d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08\": container with ID starting with d1c727e444b798a9f19bb20f2a43ab26b74c929e7fc72824b497ade9bcc2ac08 not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606127 4772 scope.go:117] "RemoveContainer" containerID="ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.606409 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\": container with ID starting with ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a not found: ID does not exist" containerID="ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606433 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a"} err="failed to get container status \"ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\": rpc error: code = NotFound desc = could not find container \"ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a\": container with ID starting with ed0afec69304057b659922d98e91e37f07c44d3ad4b4e6e2e5633f394164ae4a not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606449 4772 scope.go:117] "RemoveContainer" containerID="cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.606692 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\": container with ID starting with cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d not found: ID does not exist" containerID="cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606715 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d"} err="failed to get container status \"cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\": rpc error: code = NotFound desc = could not find container \"cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d\": container with ID starting with cbbc7ef9a19ac21602529b8c46914ec727c2c7517f03372ab9aa3c823d315f4d not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606734 4772 scope.go:117] "RemoveContainer" containerID="013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.606951 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\": container with ID starting with 013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59 not found: ID does not exist" containerID="013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606974 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59"} err="failed to get container status \"013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\": rpc error: code = NotFound desc = could not find container 
\"013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59\": container with ID starting with 013829a83ab4749028d5a3020a9bab5621cef37de23ed39e25524caf3b340a59 not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.606989 4772 scope.go:117] "RemoveContainer" containerID="f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.607189 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\": container with ID starting with f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b not found: ID does not exist" containerID="f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.607207 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b"} err="failed to get container status \"f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\": rpc error: code = NotFound desc = could not find container \"f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b\": container with ID starting with f8200e14f2cf3fdae1549435b9f62f8588bd36d23201077561bba97795684d9b not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.607220 4772 scope.go:117] "RemoveContainer" containerID="6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.607411 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\": container with ID starting with 6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5 not found: ID does not exist" 
containerID="6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.607431 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5"} err="failed to get container status \"6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\": rpc error: code = NotFound desc = could not find container \"6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5\": container with ID starting with 6b3609eb7fbaf4e8741fc38683a8eb92729ac16475b035eb0476a9546b007bd5 not found: ID does not exist" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.666131 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.666471 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.666752 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.666975 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.667272 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.673320 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.673699 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.674941 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.675147 4772 status_manager.go:851] "Failed to get status for pod" 
podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.675445 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.675662 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.677374 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.938846 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.939118 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.939748 4772 controller.go:195] 
"Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.939945 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.940126 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:54 crc kubenswrapper[4772]: I0127 15:10:54.940158 4772 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 27 15:10:54 crc kubenswrapper[4772]: E0127 15:10:54.940353 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="200ms" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.141818 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="400ms" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.542493 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="800ms" Jan 27 
15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.950363 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:10:55Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:10:55Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:10:55Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:10:55Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:024b1ed0676c2e11f6a319392c82e7acd0ceeae31ca00b202307c4d86a796b20\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ada03173793960eaa0e4263282fcbf5af3dea8aaf2c3b0d864906108db062e8a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1672061160},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[],\\\"sizeBytes\\\":1201988853},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:19cccb48b9fd18a6ae02a77aeef83cf3d8e0bbde057c41c2a818afab51c859be
\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:965650b0707047c6697952f57d2544e475608b828d1a3638867a50a7cdaf87b8\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1186361067},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:6d91aecdb391dd0cbb56f2b6335674bd2b4a25c63f0b9e893ba8977a71be3c0d\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:98739198606db13baf3fa39b12298669778a619dff80b9b5d51987d7f76056c9\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1180173538},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\
\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"s
izeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.951499 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.952029 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.952480 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.952955 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:10:55 crc kubenswrapper[4772]: E0127 15:10:55.952984 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:10:56 crc kubenswrapper[4772]: E0127 15:10:56.343219 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="1.6s" Jan 27 15:10:57 crc kubenswrapper[4772]: E0127 15:10:57.943977 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="3.2s" Jan 27 15:10:59 crc kubenswrapper[4772]: E0127 15:10:59.737268 4772 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" volumeName="registry-storage" Jan 27 15:11:00 crc kubenswrapper[4772]: I0127 15:11:00.963949 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:00 crc kubenswrapper[4772]: I0127 15:11:00.965023 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:00 crc kubenswrapper[4772]: I0127 15:11:00.966098 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:11:00 crc kubenswrapper[4772]: E0127 15:11:00.966381 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator 
pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:11:01 crc kubenswrapper[4772]: E0127 15:11:01.145152 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.134:6443: connect: connection refused" interval="6.4s" Jan 27 15:11:01 crc kubenswrapper[4772]: I0127 15:11:01.553921 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:11:01 crc kubenswrapper[4772]: E0127 15:11:01.554310 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:11:02 crc kubenswrapper[4772]: E0127 15:11:02.761324 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.134:6443: connect: connection refused" event="&Event{ObjectMeta:{marketplace-operator-79b997595-2glnd.188e9f1d5cf52191 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-2glnd,UID:d8591d45-25d0-47ea-a856-9cd5334e4a8c,APIVersion:v1,ResourceVersion:29454,FieldPath:spec.containers{marketplace-operator},},Reason:Created,Message:Created container 
marketplace-operator,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC m=+238.046229467,LastTimestamp:2026-01-27 15:10:52.065620369 +0000 UTC m=+238.046229467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.662752 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.666759 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.667315 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.667696 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.668038 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.668475 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.669256 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.669827 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.670471 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.672876 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.673590 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.674131 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.674595 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.675005 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.675516 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.676841 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.677220 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.677643 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.678240 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.678786 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.679648 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.697636 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.697669 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:04 crc kubenswrapper[4772]: E0127 15:11:04.698120 4772 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:04 crc kubenswrapper[4772]: I0127 15:11:04.698546 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.576615 4772 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d1f019d2285511058b91ca49aeeeb1f9b79aa95fe8f3a01d9d191fa07365d0b1" exitCode=0 Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.576728 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d1f019d2285511058b91ca49aeeeb1f9b79aa95fe8f3a01d9d191fa07365d0b1"} Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.576985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"487453b2cce3e10be13ca110bb948d070e993f627cca8085aaa057a5ea8d371e"} Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.577548 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.577564 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.578028 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: E0127 15:11:05.578062 4772 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.578284 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.578632 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.579027 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.579291 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.579574 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" 
pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.579814 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.580157 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.580776 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.581121 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.581373 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.581434 4772 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c" exitCode=1 Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.581474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c"} Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.582110 4772 scope.go:117] "RemoveContainer" containerID="573a9989fcd89f53d26d43e7b4c495cd9a4bbe98ebf1ed0a0dfd0e63875d5b8c" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.582184 4772 status_manager.go:851] "Failed to get status for pod" podUID="96e88efd-1f25-4e44-b459-ab773db93656" pod="openshift-marketplace/certified-operators-9wdps" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-9wdps\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.582761 4772 status_manager.go:851] "Failed to get status for pod" podUID="f637b998-b13b-486d-9042-4cd40a01c833" pod="openshift-marketplace/redhat-operators-75jrg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-75jrg\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.583239 4772 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.583586 4772 status_manager.go:851] "Failed to get status for pod" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" pod="openshift-marketplace/redhat-marketplace-xp8ph" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-xp8ph\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.583912 4772 status_manager.go:851] "Failed to get status for pod" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-2glnd\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.584211 4772 status_manager.go:851] "Failed to get status for pod" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" pod="openshift-marketplace/community-operators-95rh9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-95rh9\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.584722 4772 status_manager.go:851] "Failed to get status for pod" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" pod="openshift-marketplace/redhat-marketplace-dfgjh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dfgjh\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.585061 4772 status_manager.go:851] "Failed to get status for pod" podUID="d0b33686-8107-4caf-b67f-3c608119a049" 
pod="openshift-marketplace/redhat-operators-k7pfr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-k7pfr\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.585449 4772 status_manager.go:851] "Failed to get status for pod" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" pod="openshift-marketplace/marketplace-operator-79b997595-4lj2h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-4lj2h\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.585821 4772 status_manager.go:851] "Failed to get status for pod" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" pod="openshift-marketplace/community-operators-jwrpk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jwrpk\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:05 crc kubenswrapper[4772]: I0127 15:11:05.586097 4772 status_manager.go:851] "Failed to get status for pod" podUID="7bd0c383-7376-4e95-9919-863297cbd807" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.134:6443: connect: connection refused" Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.598383 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.598703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"13aecdb9184e78557877d3f0d60ede8ad7b6734ec979e1825d5e4d812b031a46"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"19e50c2fa00f22230547bd9cba2a6de870dac135d7760fc3b8c311abd01dc913"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe1a327d404ff115772dd864052396e082122483670a12515575f32b9375f496"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603083 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c38a0ebd0db1446c81c4731b72138976a46da7274531d03dbdb5286a6a7936ea"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603093 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9687b41b79ca543950ce1b639f10e2b19aea7dc77c2fa6716ac495daadca5b55"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5b3a1335d3569ec7885da4d368e7826c73c087ba95e7872fb2892ab273e7a880"} Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603301 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603368 4772 kubelet.go:1909] 
"Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:06 crc kubenswrapper[4772]: I0127 15:11:06.603387 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:07 crc kubenswrapper[4772]: I0127 15:11:07.023728 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:11:07 crc kubenswrapper[4772]: I0127 15:11:07.029505 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:11:07 crc kubenswrapper[4772]: I0127 15:11:07.607204 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:11:09 crc kubenswrapper[4772]: I0127 15:11:09.699278 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:09 crc kubenswrapper[4772]: I0127 15:11:09.699639 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:09 crc kubenswrapper[4772]: I0127 15:11:09.704252 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:12 crc kubenswrapper[4772]: I0127 15:11:12.422712 4772 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:12 crc kubenswrapper[4772]: I0127 15:11:12.628907 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:12 crc kubenswrapper[4772]: I0127 15:11:12.628941 4772 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:12 crc kubenswrapper[4772]: I0127 15:11:12.632937 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:13 crc kubenswrapper[4772]: I0127 15:11:13.632407 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:13 crc kubenswrapper[4772]: I0127 15:11:13.632438 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="11f71341-1cdc-430d-8d90-a87af2a493f1" Jan 27 15:11:14 crc kubenswrapper[4772]: I0127 15:11:14.674621 4772 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8fa90b23-89b2-4953-9e5c-23d515bc2224" Jan 27 15:11:15 crc kubenswrapper[4772]: I0127 15:11:15.664703 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.312617 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" containerID="cri-o://437c578755bfcacf0145c1b3dcede3b1938b4e11e6ad9c7db9d8ac6a8b6df37e" gracePeriod=15 Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.652024 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/2.log" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.654520 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/1.log" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.654572 4772 generic.go:334] "Generic (PLEG): container finished" podID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" containerID="89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399" exitCode=1 Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.654630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" event={"ID":"d8591d45-25d0-47ea-a856-9cd5334e4a8c","Type":"ContainerDied","Data":"89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399"} Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.654749 4772 scope.go:117] "RemoveContainer" containerID="c22226245670f0b13e58561f751beb68f90896b450c95a9a05a5d73430ee3957" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.655658 4772 scope.go:117] "RemoveContainer" containerID="89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399" Jan 27 15:11:16 crc kubenswrapper[4772]: E0127 15:11:16.656015 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.656937 4772 generic.go:334] "Generic (PLEG): container finished" podID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerID="437c578755bfcacf0145c1b3dcede3b1938b4e11e6ad9c7db9d8ac6a8b6df37e" exitCode=0 Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.657012 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" event={"ID":"06cdc094-b372-4016-bc5e-4c15a28e032e","Type":"ContainerDied","Data":"437c578755bfcacf0145c1b3dcede3b1938b4e11e6ad9c7db9d8ac6a8b6df37e"} Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.657055 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" event={"ID":"06cdc094-b372-4016-bc5e-4c15a28e032e","Type":"ContainerDied","Data":"3982e848b6f4ab4d0d2958e425dd1a480bb7b8b136363856076bf9ce68e097fb"} Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.657075 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3982e848b6f4ab4d0d2958e425dd1a480bb7b8b136363856076bf9ce68e097fb" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.682371 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.839682 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.838282 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.839889 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841132 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841493 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841564 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") 
" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841617 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841677 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841723 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841762 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841825 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841859 4772 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.841962 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842011 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842078 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842115 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login\") pod \"06cdc094-b372-4016-bc5e-4c15a28e032e\" (UID: \"06cdc094-b372-4016-bc5e-4c15a28e032e\") " Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842076 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842560 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842766 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842804 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842830 4772 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/06cdc094-b372-4016-bc5e-4c15a28e032e-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.842855 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 
15:11:16.842985 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.847069 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.848622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.848916 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v" (OuterVolumeSpecName: "kube-api-access-l5f6v") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "kube-api-access-l5f6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.849049 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.852049 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.852209 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.854532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.855277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.859493 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "06cdc094-b372-4016-bc5e-4c15a28e032e" (UID: "06cdc094-b372-4016-bc5e-4c15a28e032e"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943400 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5f6v\" (UniqueName: \"kubernetes.io/projected/06cdc094-b372-4016-bc5e-4c15a28e032e-kube-api-access-l5f6v\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943429 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943439 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943448 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943456 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943465 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943474 4772 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943482 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943492 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:16 crc kubenswrapper[4772]: I0127 15:11:16.943503 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/06cdc094-b372-4016-bc5e-4c15a28e032e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:17 crc kubenswrapper[4772]: I0127 15:11:17.615345 4772 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-fgw98 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 15:11:17 crc kubenswrapper[4772]: I0127 15:11:17.615425 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 15:11:17 crc kubenswrapper[4772]: I0127 15:11:17.664361 4772 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/2.log" Jan 27 15:11:17 crc kubenswrapper[4772]: I0127 15:11:17.664488 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fgw98" Jan 27 15:11:20 crc kubenswrapper[4772]: I0127 15:11:20.964151 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:20 crc kubenswrapper[4772]: I0127 15:11:20.964758 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:20 crc kubenswrapper[4772]: I0127 15:11:20.965640 4772 scope.go:117] "RemoveContainer" containerID="89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399" Jan 27 15:11:20 crc kubenswrapper[4772]: E0127 15:11:20.966121 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:11:22 crc kubenswrapper[4772]: I0127 15:11:22.082590 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 15:11:22 crc kubenswrapper[4772]: I0127 15:11:22.485101 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 15:11:22 crc kubenswrapper[4772]: I0127 15:11:22.643013 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Jan 27 15:11:22 crc kubenswrapper[4772]: I0127 15:11:22.783862 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 15:11:23 crc kubenswrapper[4772]: I0127 15:11:23.226885 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 15:11:23 crc kubenswrapper[4772]: I0127 15:11:23.501532 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 15:11:23 crc kubenswrapper[4772]: I0127 15:11:23.862073 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 15:11:23 crc kubenswrapper[4772]: I0127 15:11:23.916778 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.099214 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.216571 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.364821 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.463046 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.610407 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 
15:11:24.735616 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.874786 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.888307 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 15:11:24 crc kubenswrapper[4772]: I0127 15:11:24.967797 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.052508 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.136806 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.159949 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.296741 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.351823 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.385071 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.388723 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 15:11:25 crc 
kubenswrapper[4772]: I0127 15:11:25.398662 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.506273 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.619042 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.906679 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 15:11:25 crc kubenswrapper[4772]: I0127 15:11:25.912628 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.001666 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.054000 4772 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.068432 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.239238 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.359145 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.389409 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 15:11:26 crc 
kubenswrapper[4772]: I0127 15:11:26.394888 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.579143 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.642957 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.783988 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.838934 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.849643 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 15:11:26 crc kubenswrapper[4772]: I0127 15:11:26.950666 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.017913 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.030189 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.049930 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.249809 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.262598 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.286411 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.308484 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.316667 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.377741 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.397868 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.414360 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.453731 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.622724 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.678397 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.838518 4772 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.902301 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.924390 4772 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.968679 4772 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.984353 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 15:11:27 crc kubenswrapper[4772]: I0127 15:11:27.990392 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.106421 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.113094 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.184221 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.196542 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.217276 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 15:11:28 crc 
kubenswrapper[4772]: I0127 15:11:28.314707 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.450594 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.494190 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.587310 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.695781 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.760340 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.824698 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.825087 4772 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.837194 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.853047 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.883801 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 15:11:28 crc kubenswrapper[4772]: I0127 15:11:28.900090 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.010415 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.146540 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.194445 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.298674 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.373946 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.395969 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.488963 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.515908 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.559087 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.597994 4772 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.637976 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.699925 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 15:11:29 crc kubenswrapper[4772]: I0127 15:11:29.747878 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.050114 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.126822 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.213462 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.220569 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.226026 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.295887 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.296945 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 
15:11:30.427872 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.434692 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.438553 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.465539 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.496036 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.566288 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.579846 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.580093 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.694267 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.747060 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.761444 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.784862 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.816135 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.867348 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 15:11:30 crc kubenswrapper[4772]: I0127 15:11:30.970914 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.029660 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.083394 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.136827 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.164289 4772 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.170248 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-75jrg","openshift-marketplace/community-operators-jwrpk","openshift-marketplace/redhat-marketplace-dfgjh","openshift-marketplace/marketplace-operator-79b997595-4lj2h","openshift-marketplace/redhat-operators-k7pfr","openshift-marketplace/redhat-marketplace-xp8ph","openshift-authentication/oauth-openshift-558db77b4-fgw98","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/certified-operators-9wdps","openshift-marketplace/community-operators-95rh9"] Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.170354 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.174369 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.174396 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.192476 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.192458385 podStartE2EDuration="19.192458385s" podCreationTimestamp="2026-01-27 15:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:11:31.191643951 +0000 UTC m=+277.172253059" watchObservedRunningTime="2026-01-27 15:11:31.192458385 +0000 UTC m=+277.173067483" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.292548 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.325600 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" 
Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.370133 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.472335 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.603223 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.675950 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.804479 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.896321 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.951358 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 15:11:31 crc kubenswrapper[4772]: I0127 15:11:31.982569 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.165492 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.173665 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.191954 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.259742 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.407566 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.426073 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.440510 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.477291 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.524091 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.602783 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.604476 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.655187 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.668407 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" path="/var/lib/kubelet/pods/06cdc094-b372-4016-bc5e-4c15a28e032e/volumes" Jan 27 15:11:32 crc 
kubenswrapper[4772]: I0127 15:11:32.669081 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7" path="/var/lib/kubelet/pods/4232ddc2-0fc9-45c7-b52a-ee96e2e3cef7/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.669730 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbabfa8-79d8-4b23-b186-b40ba8b3017e" path="/var/lib/kubelet/pods/8cbabfa8-79d8-4b23-b186-b40ba8b3017e/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.670782 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e88efd-1f25-4e44-b459-ab773db93656" path="/var/lib/kubelet/pods/96e88efd-1f25-4e44-b459-ab773db93656/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.671384 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2b5800-ce98-4847-bfcd-67a97375aa1b" path="/var/lib/kubelet/pods/ac2b5800-ce98-4847-bfcd-67a97375aa1b/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.672466 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8ebf890-c3b0-468e-bf7d-0ec590df084b" path="/var/lib/kubelet/pods/c8ebf890-c3b0-468e-bf7d-0ec590df084b/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.672910 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b33686-8107-4caf-b67f-3c608119a049" path="/var/lib/kubelet/pods/d0b33686-8107-4caf-b67f-3c608119a049/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.673563 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd415ccf-2b4a-4797-962f-a464ef96bc22" path="/var/lib/kubelet/pods/dd415ccf-2b4a-4797-962f-a464ef96bc22/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.674154 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.674633 4772 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f637b998-b13b-486d-9042-4cd40a01c833" path="/var/lib/kubelet/pods/f637b998-b13b-486d-9042-4cd40a01c833/volumes" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.752578 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.767977 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.792162 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.875010 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.933234 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 15:11:32 crc kubenswrapper[4772]: I0127 15:11:32.970333 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.067784 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.191915 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.395245 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.395475 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.532335 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.616080 4772 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.627220 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.700959 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.747469 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.855534 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.857404 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 15:11:33 crc kubenswrapper[4772]: I0127 15:11:33.953785 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.024999 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.064452 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.175366 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.176002 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.198225 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.245625 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.246927 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.321025 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.438120 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.464725 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.501228 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.507607 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 15:11:34 crc kubenswrapper[4772]: 
I0127 15:11:34.572610 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.577603 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.585505 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.602096 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.666271 4772 scope.go:117] "RemoveContainer" containerID="89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399" Jan 27 15:11:34 crc kubenswrapper[4772]: E0127 15:11:34.666445 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-2glnd_openshift-marketplace(d8591d45-25d0-47ea-a856-9cd5334e4a8c)\"" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.696660 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.699647 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.725995 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.814743 
4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.842465 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:11:34 crc kubenswrapper[4772]: I0127 15:11:34.855760 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.017629 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.096913 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.105450 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.148686 4772 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.148910 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4" gracePeriod=5 Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.239092 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.278976 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.544712 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.656215 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.830433 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.909845 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 15:11:35 crc kubenswrapper[4772]: I0127 15:11:35.938775 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.021724 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.037533 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.143354 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.172355 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.197478 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 
15:11:36.245631 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.301222 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.353943 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.399598 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.451068 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.690651 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698306 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7687c8778f-hqrqm"] Jan 27 15:11:36 crc kubenswrapper[4772]: E0127 15:11:36.698551 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698575 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:11:36 crc kubenswrapper[4772]: E0127 15:11:36.698596 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698604 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" Jan 27 15:11:36 crc kubenswrapper[4772]: E0127 15:11:36.698620 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd0c383-7376-4e95-9919-863297cbd807" containerName="installer" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698627 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd0c383-7376-4e95-9919-863297cbd807" containerName="installer" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698751 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd0c383-7376-4e95-9919-863297cbd807" containerName="installer" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698770 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.698782 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cdc094-b372-4016-bc5e-4c15a28e032e" containerName="oauth-openshift" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.699274 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.705553 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.706138 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.706343 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.706475 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.706589 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.707027 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.707854 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.708078 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.708255 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.708436 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 15:11:36 crc 
kubenswrapper[4772]: I0127 15:11:36.710586 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.705453 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.712884 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.713720 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7687c8778f-hqrqm"] Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.728279 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.732842 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782231 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782298 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-policies\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " 
pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782322 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-session\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782353 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-login\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782380 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782396 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b9xq\" (UniqueName: \"kubernetes.io/projected/4bcd7451-5572-4b94-a244-746f5c7145a2-kube-api-access-4b9xq\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782419 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782442 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-error\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782463 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782483 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782498 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782516 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782546 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.782565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-dir\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-policies\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " 
pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884242 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-session\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-login\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884306 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884377 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b9xq\" (UniqueName: \"kubernetes.io/projected/4bcd7451-5572-4b94-a244-746f5c7145a2-kube-api-access-4b9xq\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884417 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884460 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-error\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884487 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884542 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: 
\"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-dir\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.884659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.886880 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.886938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-dir\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.886935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-audit-policies\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.887432 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-service-ca\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.888229 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc 
kubenswrapper[4772]: I0127 15:11:36.890988 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.891543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.891908 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-login\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.893544 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-router-certs\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.893876 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.903496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-session\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.905370 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.905751 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bcd7451-5572-4b94-a244-746f5c7145a2-v4-0-config-user-template-error\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.906086 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.907987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b9xq\" (UniqueName: 
\"kubernetes.io/projected/4bcd7451-5572-4b94-a244-746f5c7145a2-kube-api-access-4b9xq\") pod \"oauth-openshift-7687c8778f-hqrqm\" (UID: \"4bcd7451-5572-4b94-a244-746f5c7145a2\") " pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:36 crc kubenswrapper[4772]: I0127 15:11:36.934663 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.032058 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.115534 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.137748 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.351375 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.361659 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.429470 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.452053 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7687c8778f-hqrqm"] Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.469864 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.565699 4772 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.619666 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.656654 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.661004 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.766659 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" event={"ID":"4bcd7451-5572-4b94-a244-746f5c7145a2","Type":"ContainerStarted","Data":"7c7eff52d5c75bc856315b147ca80c5389960202b855ba1e1bbca7f221717152"} Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.766716 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" event={"ID":"4bcd7451-5572-4b94-a244-746f5c7145a2","Type":"ContainerStarted","Data":"da64ed2d13939519724584d6676d260ea792a791943df0b2581f3127f25a5a56"} Jan 27 15:11:37 crc kubenswrapper[4772]: I0127 15:11:37.816809 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.015773 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.024299 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.032560 4772 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.089886 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.237057 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.245548 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.368687 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.772498 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.777856 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" Jan 27 15:11:38 crc kubenswrapper[4772]: I0127 15:11:38.803294 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7687c8778f-hqrqm" podStartSLOduration=47.803262461 podStartE2EDuration="47.803262461s" podCreationTimestamp="2026-01-27 15:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:11:37.794011996 +0000 UTC m=+283.774621114" watchObservedRunningTime="2026-01-27 15:11:38.803262461 +0000 UTC m=+284.783871599" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.166636 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"oauth-serving-cert" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.241750 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.363085 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.378753 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.443761 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.476843 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.898459 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 15:11:39 crc kubenswrapper[4772]: I0127 15:11:39.935300 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.179849 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.223691 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.723504 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 15:11:40 crc 
kubenswrapper[4772]: I0127 15:11:40.723632 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.785261 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.785318 4772 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4" exitCode=137 Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.785386 4772 scope.go:117] "RemoveContainer" containerID="6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.785386 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.800880 4772 scope.go:117] "RemoveContainer" containerID="6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4" Jan 27 15:11:40 crc kubenswrapper[4772]: E0127 15:11:40.801316 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4\": container with ID starting with 6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4 not found: ID does not exist" containerID="6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.801359 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4"} err="failed to get container status 
\"6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4\": rpc error: code = NotFound desc = could not find container \"6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4\": container with ID starting with 6f6210876fa329e0bc46c9fcfeb492e4200a121e5f183839eb54f27fb32b52f4 not found: ID does not exist" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832780 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832873 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832897 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832941 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.832985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833027 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833104 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833200 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833317 4772 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833334 4772 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833345 4772 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.833356 4772 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.841116 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:11:40 crc kubenswrapper[4772]: I0127 15:11:40.934891 4772 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:41 crc kubenswrapper[4772]: I0127 15:11:41.303499 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 15:11:42 crc kubenswrapper[4772]: I0127 15:11:42.669384 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.663237 4772 scope.go:117] "RemoveContainer" containerID="89be95f2c0621af885c1da302ff7e0b3e84dda04f9608b6e8af398107e1e9399" Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.824614 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/2.log" Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.824674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" event={"ID":"d8591d45-25d0-47ea-a856-9cd5334e4a8c","Type":"ContainerStarted","Data":"fed559ed1ffbdad5f1dd35dfce6ce0006386e7520831ccdcc78b5bc1005d87b8"} Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.825202 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.826829 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2glnd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 
10.217.0.56:8080: connect: connection refused" start-of-body= Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.826894 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podUID="d8591d45-25d0-47ea-a856-9cd5334e4a8c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.56:8080/healthz\": dial tcp 10.217.0.56:8080: connect: connection refused" Jan 27 15:11:47 crc kubenswrapper[4772]: I0127 15:11:47.847729 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" podStartSLOduration=57.847712173 podStartE2EDuration="57.847712173s" podCreationTimestamp="2026-01-27 15:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:11:47.842144599 +0000 UTC m=+293.822753697" watchObservedRunningTime="2026-01-27 15:11:47.847712173 +0000 UTC m=+293.828321271" Jan 27 15:11:48 crc kubenswrapper[4772]: I0127 15:11:48.831838 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2glnd" Jan 27 15:11:54 crc kubenswrapper[4772]: I0127 15:11:54.465484 4772 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.069513 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.070344 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" containerID="cri-o://bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb" 
gracePeriod=30 Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.131577 4772 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6pclx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.131626 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.193083 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.193437 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerName="route-controller-manager" containerID="cri-o://3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f" gracePeriod=30 Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.414443 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert\") pod \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514159 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca\") pod \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514234 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config\") pod \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514389 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles\") pod \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514429 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c7gp\" (UniqueName: \"kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp\") pod \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\" (UID: \"3dfd9a91-e760-4c80-96e6-ca6525aa86b8\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.514996 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "3dfd9a91-e760-4c80-96e6-ca6525aa86b8" (UID: "3dfd9a91-e760-4c80-96e6-ca6525aa86b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.515621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config" (OuterVolumeSpecName: "config") pod "3dfd9a91-e760-4c80-96e6-ca6525aa86b8" (UID: "3dfd9a91-e760-4c80-96e6-ca6525aa86b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.516239 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3dfd9a91-e760-4c80-96e6-ca6525aa86b8" (UID: "3dfd9a91-e760-4c80-96e6-ca6525aa86b8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.516645 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.519561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3dfd9a91-e760-4c80-96e6-ca6525aa86b8" (UID: "3dfd9a91-e760-4c80-96e6-ca6525aa86b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.519931 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp" (OuterVolumeSpecName: "kube-api-access-8c7gp") pod "3dfd9a91-e760-4c80-96e6-ca6525aa86b8" (UID: "3dfd9a91-e760-4c80-96e6-ca6525aa86b8"). InnerVolumeSpecName "kube-api-access-8c7gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615502 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m24gk\" (UniqueName: \"kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk\") pod \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615553 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config\") pod \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615621 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert\") pod \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615654 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca\") pod \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\" (UID: \"8d519648-7eaa-49bb-9a09-bd91d09d98c0\") " Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615827 4772 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615837 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615847 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c7gp\" (UniqueName: \"kubernetes.io/projected/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-kube-api-access-8c7gp\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615855 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.615862 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dfd9a91-e760-4c80-96e6-ca6525aa86b8-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.616425 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d519648-7eaa-49bb-9a09-bd91d09d98c0" (UID: "8d519648-7eaa-49bb-9a09-bd91d09d98c0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.616438 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config" (OuterVolumeSpecName: "config") pod "8d519648-7eaa-49bb-9a09-bd91d09d98c0" (UID: "8d519648-7eaa-49bb-9a09-bd91d09d98c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.618093 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk" (OuterVolumeSpecName: "kube-api-access-m24gk") pod "8d519648-7eaa-49bb-9a09-bd91d09d98c0" (UID: "8d519648-7eaa-49bb-9a09-bd91d09d98c0"). InnerVolumeSpecName "kube-api-access-m24gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.618332 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d519648-7eaa-49bb-9a09-bd91d09d98c0" (UID: "8d519648-7eaa-49bb-9a09-bd91d09d98c0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.717037 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m24gk\" (UniqueName: \"kubernetes.io/projected/8d519648-7eaa-49bb-9a09-bd91d09d98c0-kube-api-access-m24gk\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.717091 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.717112 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d519648-7eaa-49bb-9a09-bd91d09d98c0-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.717131 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d519648-7eaa-49bb-9a09-bd91d09d98c0-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.867584 4772 generic.go:334] "Generic (PLEG): container finished" podID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerID="bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb" exitCode=0 Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.867630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" event={"ID":"3dfd9a91-e760-4c80-96e6-ca6525aa86b8","Type":"ContainerDied","Data":"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb"} Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.867675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" 
event={"ID":"3dfd9a91-e760-4c80-96e6-ca6525aa86b8","Type":"ContainerDied","Data":"380eb12e1295d0270de3d27b76c2692262e194cd8678145268c2765050e2b23e"} Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.867695 4772 scope.go:117] "RemoveContainer" containerID="bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.867640 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6pclx" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.869565 4772 generic.go:334] "Generic (PLEG): container finished" podID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerID="3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f" exitCode=0 Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.869613 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" event={"ID":"8d519648-7eaa-49bb-9a09-bd91d09d98c0","Type":"ContainerDied","Data":"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f"} Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.869651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" event={"ID":"8d519648-7eaa-49bb-9a09-bd91d09d98c0","Type":"ContainerDied","Data":"326c33fb529962a602f8dfd5dbe7dcbd0ebb132fe4709244f27812007b261a68"} Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.869727 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.884862 4772 scope.go:117] "RemoveContainer" containerID="bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb" Jan 27 15:11:55 crc kubenswrapper[4772]: E0127 15:11:55.885358 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb\": container with ID starting with bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb not found: ID does not exist" containerID="bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.885413 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb"} err="failed to get container status \"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb\": rpc error: code = NotFound desc = could not find container \"bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb\": container with ID starting with bb01029e32299fb52d56d061afc654aa573880622860ea04caf54ad26b9a84eb not found: ID does not exist" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.885448 4772 scope.go:117] "RemoveContainer" containerID="3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.903643 4772 scope.go:117] "RemoveContainer" containerID="3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.903800 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:11:55 crc kubenswrapper[4772]: E0127 15:11:55.904557 4772 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f\": container with ID starting with 3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f not found: ID does not exist" containerID="3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.904611 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f"} err="failed to get container status \"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f\": rpc error: code = NotFound desc = could not find container \"3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f\": container with ID starting with 3506ad5b58f850018e8ca4a14f82aae1d0b2f9ec52328d0669f3f49efb696d0f not found: ID does not exist" Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.906873 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6pclx"] Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.915263 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:11:55 crc kubenswrapper[4772]: I0127 15:11:55.918109 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-r9glz"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.670239 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" path="/var/lib/kubelet/pods/3dfd9a91-e760-4c80-96e6-ca6525aa86b8/volumes" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.670781 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" 
path="/var/lib/kubelet/pods/8d519648-7eaa-49bb-9a09-bd91d09d98c0/volumes" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.707661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq"] Jan 27 15:11:56 crc kubenswrapper[4772]: E0127 15:11:56.708100 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.708125 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: E0127 15:11:56.708139 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerName="route-controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.708181 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerName="route-controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.708348 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dfd9a91-e760-4c80-96e6-ca6525aa86b8" containerName="controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.708372 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d519648-7eaa-49bb-9a09-bd91d09d98c0" containerName="route-controller-manager" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.709001 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.711681 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.712106 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.712823 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.718498 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.718731 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.718785 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.718731 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719217 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719245 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719368 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719508 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719733 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719772 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.719788 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.722977 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.735472 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.743229 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.835096 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.836272 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837480 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5748bb4e-846d-457a-af17-b1d6f0a36431-serving-cert\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837526 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837555 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837593 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-client-ca\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837615 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837637 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-proxy-ca-bundles\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837668 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-799x8\" (UniqueName: \"kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837692 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-config\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.837711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkgwr\" (UniqueName: \"kubernetes.io/projected/5748bb4e-846d-457a-af17-b1d6f0a36431-kube-api-access-fkgwr\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: 
\"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.839342 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.853262 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.938927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939406 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939556 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939643 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-client-ca\") pod 
\"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939732 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939854 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csq5c\" (UniqueName: \"kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.939964 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-proxy-ca-bundles\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.940139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-799x8\" (UniqueName: \"kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.940303 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-config\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.940408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkgwr\" (UniqueName: \"kubernetes.io/projected/5748bb4e-846d-457a-af17-b1d6f0a36431-kube-api-access-fkgwr\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.940564 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5748bb4e-846d-457a-af17-b1d6f0a36431-serving-cert\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.942371 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.942408 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-config\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: 
I0127 15:11:56.940756 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.941817 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.940756 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-client-ca\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.941634 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5748bb4e-846d-457a-af17-b1d6f0a36431-proxy-ca-bundles\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.945276 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5748bb4e-846d-457a-af17-b1d6f0a36431-serving-cert\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " 
pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.945734 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.960324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkgwr\" (UniqueName: \"kubernetes.io/projected/5748bb4e-846d-457a-af17-b1d6f0a36431-kube-api-access-fkgwr\") pod \"controller-manager-9d7ff4cd6-9jdqq\" (UID: \"5748bb4e-846d-457a-af17-b1d6f0a36431\") " pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:56 crc kubenswrapper[4772]: I0127 15:11:56.960404 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-799x8\" (UniqueName: \"kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8\") pod \"route-controller-manager-55985dff9-whh87\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.032639 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dfvcs"] Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.033769 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.036767 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.037833 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.041281 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfvcs"] Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.046450 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.047123 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.047238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.047282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csq5c\" (UniqueName: \"kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c\") pod \"community-operators-4vbvz\" (UID: 
\"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.049643 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.049703 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.080503 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csq5c\" (UniqueName: \"kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c\") pod \"community-operators-4vbvz\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.164979 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.165940 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-catalog-content\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.165969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvh7\" (UniqueName: \"kubernetes.io/projected/881b071c-048c-4f66-96e7-fd1f91ca23f8-kube-api-access-tfvh7\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.166006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-utilities\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.267220 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-utilities\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.267652 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-catalog-content\") pod 
\"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.267688 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvh7\" (UniqueName: \"kubernetes.io/projected/881b071c-048c-4f66-96e7-fd1f91ca23f8-kube-api-access-tfvh7\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.268615 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-utilities\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.268878 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881b071c-048c-4f66-96e7-fd1f91ca23f8-catalog-content\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.276460 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.292041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvh7\" (UniqueName: \"kubernetes.io/projected/881b071c-048c-4f66-96e7-fd1f91ca23f8-kube-api-access-tfvh7\") pod \"certified-operators-dfvcs\" (UID: \"881b071c-048c-4f66-96e7-fd1f91ca23f8\") " pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.332386 4772 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq"] Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.362319 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 15:11:57 crc kubenswrapper[4772]: W0127 15:11:57.369551 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e4371a_8bd2_4405_bb18_861923bfd37e.slice/crio-f541cf85cdae422e49f9b3df32c8dd0416ecc007711177dc8c1d7d7680929e43 WatchSource:0}: Error finding container f541cf85cdae422e49f9b3df32c8dd0416ecc007711177dc8c1d7d7680929e43: Status 404 returned error can't find the container with id f541cf85cdae422e49f9b3df32c8dd0416ecc007711177dc8c1d7d7680929e43 Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.383313 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.606912 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfvcs"] Jan 27 15:11:57 crc kubenswrapper[4772]: W0127 15:11:57.655240 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881b071c_048c_4f66_96e7_fd1f91ca23f8.slice/crio-fb988077c86d56b0ecd59d7c83bf17183c0cb096ca767c2c7c33983cb62018dd WatchSource:0}: Error finding container fb988077c86d56b0ecd59d7c83bf17183c0cb096ca767c2c7c33983cb62018dd: Status 404 returned error can't find the container with id fb988077c86d56b0ecd59d7c83bf17183c0cb096ca767c2c7c33983cb62018dd Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.887559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" 
event={"ID":"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11","Type":"ContainerStarted","Data":"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.887620 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" event={"ID":"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11","Type":"ContainerStarted","Data":"fe97b2d600f6d6cc4091b116b23735caf871c4e2a0580b348ce94443cc624fcf"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.887937 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.888893 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvcs" event={"ID":"881b071c-048c-4f66-96e7-fd1f91ca23f8","Type":"ContainerStarted","Data":"fe3fede8510b6c4bb0f1ffacbe3d0ec0288f88b5a6f5f6a793e5f3dd2eeff90b"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.888934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvcs" event={"ID":"881b071c-048c-4f66-96e7-fd1f91ca23f8","Type":"ContainerStarted","Data":"fb988077c86d56b0ecd59d7c83bf17183c0cb096ca767c2c7c33983cb62018dd"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.891115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" event={"ID":"5748bb4e-846d-457a-af17-b1d6f0a36431","Type":"ContainerStarted","Data":"b33e07e1bbdb8f68554172eecced3999514bd01390467f8165d94835e2937303"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.891188 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" 
event={"ID":"5748bb4e-846d-457a-af17-b1d6f0a36431","Type":"ContainerStarted","Data":"702d8de62fe8dceb9e42a32a6d4bbb6520eefa3da0fef6914772f56056d17cc4"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.891834 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.893638 4772 generic.go:334] "Generic (PLEG): container finished" podID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerID="a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e" exitCode=0 Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.893672 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerDied","Data":"a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.893701 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerStarted","Data":"f541cf85cdae422e49f9b3df32c8dd0416ecc007711177dc8c1d7d7680929e43"} Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.895797 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.904875 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.919443 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" podStartSLOduration=2.919420792 podStartE2EDuration="2.919420792s" podCreationTimestamp="2026-01-27 
15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:11:57.916687461 +0000 UTC m=+303.897296559" watchObservedRunningTime="2026-01-27 15:11:57.919420792 +0000 UTC m=+303.900029890" Jan 27 15:11:57 crc kubenswrapper[4772]: I0127 15:11:57.945141 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9d7ff4cd6-9jdqq" podStartSLOduration=2.945102248 podStartE2EDuration="2.945102248s" podCreationTimestamp="2026-01-27 15:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:11:57.940247955 +0000 UTC m=+303.920857053" watchObservedRunningTime="2026-01-27 15:11:57.945102248 +0000 UTC m=+303.925711346" Jan 27 15:11:58 crc kubenswrapper[4772]: I0127 15:11:58.903208 4772 generic.go:334] "Generic (PLEG): container finished" podID="881b071c-048c-4f66-96e7-fd1f91ca23f8" containerID="fe3fede8510b6c4bb0f1ffacbe3d0ec0288f88b5a6f5f6a793e5f3dd2eeff90b" exitCode=0 Jan 27 15:11:58 crc kubenswrapper[4772]: I0127 15:11:58.903407 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvcs" event={"ID":"881b071c-048c-4f66-96e7-fd1f91ca23f8","Type":"ContainerDied","Data":"fe3fede8510b6c4bb0f1ffacbe3d0ec0288f88b5a6f5f6a793e5f3dd2eeff90b"} Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.238325 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f5shj"] Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.239940 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.242279 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.249912 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5shj"] Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.412869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-utilities\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.412937 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzc2r\" (UniqueName: \"kubernetes.io/projected/72dcc284-e96b-4605-a428-176ca549eeb2-kube-api-access-pzc2r\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.413044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-catalog-content\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.435380 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.436655 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.440206 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.447915 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.514146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzc2r\" (UniqueName: \"kubernetes.io/projected/72dcc284-e96b-4605-a428-176ca549eeb2-kube-api-access-pzc2r\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.514254 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-catalog-content\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.514296 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-utilities\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.515073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-utilities\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 
27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.515079 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcc284-e96b-4605-a428-176ca549eeb2-catalog-content\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.534148 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzc2r\" (UniqueName: \"kubernetes.io/projected/72dcc284-e96b-4605-a428-176ca549eeb2-kube-api-access-pzc2r\") pod \"redhat-marketplace-f5shj\" (UID: \"72dcc284-e96b-4605-a428-176ca549eeb2\") " pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.562926 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.616124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr9pf\" (UniqueName: \"kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.623766 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.623868 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.725547 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.725600 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.725667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr9pf\" (UniqueName: \"kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.726911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.728869 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.744310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr9pf\" (UniqueName: \"kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf\") pod \"redhat-operators-5whzm\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.750884 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.910890 4772 generic.go:334] "Generic (PLEG): container finished" podID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerID="3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c" exitCode=0 Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.911713 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerDied","Data":"3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c"} Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.956720 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 15:11:59 crc kubenswrapper[4772]: W0127 15:11:59.960910 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b85747_dcbc_462d_85d1_3d00801b5106.slice/crio-8e4bc519d0ced9952d6857ff31015675ce4705ed12ce2547ebdba49c33fc4d62 WatchSource:0}: Error finding container 8e4bc519d0ced9952d6857ff31015675ce4705ed12ce2547ebdba49c33fc4d62: Status 404 returned error can't find the 
container with id 8e4bc519d0ced9952d6857ff31015675ce4705ed12ce2547ebdba49c33fc4d62 Jan 27 15:11:59 crc kubenswrapper[4772]: I0127 15:11:59.965081 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f5shj"] Jan 27 15:11:59 crc kubenswrapper[4772]: W0127 15:11:59.966101 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dcc284_e96b_4605_a428_176ca549eeb2.slice/crio-55e61e49763a06e11c87abfce2babaac3a236dfbde7e77c897ef5f83bb000985 WatchSource:0}: Error finding container 55e61e49763a06e11c87abfce2babaac3a236dfbde7e77c897ef5f83bb000985: Status 404 returned error can't find the container with id 55e61e49763a06e11c87abfce2babaac3a236dfbde7e77c897ef5f83bb000985 Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.916803 4772 generic.go:334] "Generic (PLEG): container finished" podID="79b85747-dcbc-462d-85d1-3d00801b5106" containerID="d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1" exitCode=0 Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.916916 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerDied","Data":"d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.917262 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerStarted","Data":"8e4bc519d0ced9952d6857ff31015675ce4705ed12ce2547ebdba49c33fc4d62"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.920527 4772 generic.go:334] "Generic (PLEG): container finished" podID="881b071c-048c-4f66-96e7-fd1f91ca23f8" containerID="42303aea9268a5a7664e56d12a4e4ea8c3524b3434d07f5f965bd4218953e4a5" exitCode=0 Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 
15:12:00.921015 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvcs" event={"ID":"881b071c-048c-4f66-96e7-fd1f91ca23f8","Type":"ContainerDied","Data":"42303aea9268a5a7664e56d12a4e4ea8c3524b3434d07f5f965bd4218953e4a5"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.924937 4772 generic.go:334] "Generic (PLEG): container finished" podID="72dcc284-e96b-4605-a428-176ca549eeb2" containerID="a30e7f4b8f02ced5a831edb30ea63fa4ce910f9dd257e80f6edbb087e09ccd37" exitCode=0 Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.925008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5shj" event={"ID":"72dcc284-e96b-4605-a428-176ca549eeb2","Type":"ContainerDied","Data":"a30e7f4b8f02ced5a831edb30ea63fa4ce910f9dd257e80f6edbb087e09ccd37"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.925042 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5shj" event={"ID":"72dcc284-e96b-4605-a428-176ca549eeb2","Type":"ContainerStarted","Data":"55e61e49763a06e11c87abfce2babaac3a236dfbde7e77c897ef5f83bb000985"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.927419 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerStarted","Data":"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f"} Jan 27 15:12:00 crc kubenswrapper[4772]: I0127 15:12:00.970459 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4vbvz" podStartSLOduration=2.334685243 podStartE2EDuration="4.970440495s" podCreationTimestamp="2026-01-27 15:11:56 +0000 UTC" firstStartedPulling="2026-01-27 15:11:57.894706354 +0000 UTC m=+303.875315452" lastFinishedPulling="2026-01-27 15:12:00.530461606 +0000 UTC m=+306.511070704" observedRunningTime="2026-01-27 
15:12:00.967327213 +0000 UTC m=+306.947936321" watchObservedRunningTime="2026-01-27 15:12:00.970440495 +0000 UTC m=+306.951049593" Jan 27 15:12:01 crc kubenswrapper[4772]: I0127 15:12:01.939025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfvcs" event={"ID":"881b071c-048c-4f66-96e7-fd1f91ca23f8","Type":"ContainerStarted","Data":"bebeaa0da80cc73fcfa950011bea7962115e29aeaf6343634912c781ccb9d6a8"} Jan 27 15:12:01 crc kubenswrapper[4772]: I0127 15:12:01.949257 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5shj" event={"ID":"72dcc284-e96b-4605-a428-176ca549eeb2","Type":"ContainerStarted","Data":"317f0c9fa8dd2e57bedd32005f0d5d616e9ee8287f35270140e4402ef9d2fd3c"} Jan 27 15:12:01 crc kubenswrapper[4772]: I0127 15:12:01.959068 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dfvcs" podStartSLOduration=2.312772231 podStartE2EDuration="4.959047383s" podCreationTimestamp="2026-01-27 15:11:57 +0000 UTC" firstStartedPulling="2026-01-27 15:11:58.982752311 +0000 UTC m=+304.963361409" lastFinishedPulling="2026-01-27 15:12:01.629027473 +0000 UTC m=+307.609636561" observedRunningTime="2026-01-27 15:12:01.957867069 +0000 UTC m=+307.938476177" watchObservedRunningTime="2026-01-27 15:12:01.959047383 +0000 UTC m=+307.939656491" Jan 27 15:12:02 crc kubenswrapper[4772]: I0127 15:12:02.955599 4772 generic.go:334] "Generic (PLEG): container finished" podID="79b85747-dcbc-462d-85d1-3d00801b5106" containerID="eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7" exitCode=0 Jan 27 15:12:02 crc kubenswrapper[4772]: I0127 15:12:02.955733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerDied","Data":"eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7"} Jan 27 15:12:02 
crc kubenswrapper[4772]: I0127 15:12:02.958585 4772 generic.go:334] "Generic (PLEG): container finished" podID="72dcc284-e96b-4605-a428-176ca549eeb2" containerID="317f0c9fa8dd2e57bedd32005f0d5d616e9ee8287f35270140e4402ef9d2fd3c" exitCode=0 Jan 27 15:12:02 crc kubenswrapper[4772]: I0127 15:12:02.959383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5shj" event={"ID":"72dcc284-e96b-4605-a428-176ca549eeb2","Type":"ContainerDied","Data":"317f0c9fa8dd2e57bedd32005f0d5d616e9ee8287f35270140e4402ef9d2fd3c"} Jan 27 15:12:02 crc kubenswrapper[4772]: I0127 15:12:02.959406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f5shj" event={"ID":"72dcc284-e96b-4605-a428-176ca549eeb2","Type":"ContainerStarted","Data":"03bb2fbce9122be2ed850d1a52f257693a41c3bbdd07883284ed1f5f69d9a222"} Jan 27 15:12:02 crc kubenswrapper[4772]: I0127 15:12:02.994077 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f5shj" podStartSLOduration=2.38753925 podStartE2EDuration="3.994059738s" podCreationTimestamp="2026-01-27 15:11:59 +0000 UTC" firstStartedPulling="2026-01-27 15:12:00.926748008 +0000 UTC m=+306.907357106" lastFinishedPulling="2026-01-27 15:12:02.533268506 +0000 UTC m=+308.513877594" observedRunningTime="2026-01-27 15:12:02.991297997 +0000 UTC m=+308.971907115" watchObservedRunningTime="2026-01-27 15:12:02.994059738 +0000 UTC m=+308.974668836" Jan 27 15:12:03 crc kubenswrapper[4772]: I0127 15:12:03.965900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerStarted","Data":"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c"} Jan 27 15:12:03 crc kubenswrapper[4772]: I0127 15:12:03.988784 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-5whzm" podStartSLOduration=2.571040494 podStartE2EDuration="4.988761915s" podCreationTimestamp="2026-01-27 15:11:59 +0000 UTC" firstStartedPulling="2026-01-27 15:12:00.919189605 +0000 UTC m=+306.899798703" lastFinishedPulling="2026-01-27 15:12:03.336911036 +0000 UTC m=+309.317520124" observedRunningTime="2026-01-27 15:12:03.986339294 +0000 UTC m=+309.966948452" watchObservedRunningTime="2026-01-27 15:12:03.988761915 +0000 UTC m=+309.969371013" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.165693 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.166092 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.212261 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.384538 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.384582 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:12:07 crc kubenswrapper[4772]: I0127 15:12:07.425904 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:12:08 crc kubenswrapper[4772]: I0127 15:12:08.025196 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 15:12:08 crc kubenswrapper[4772]: I0127 15:12:08.025605 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-dfvcs" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.563331 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.563386 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.609945 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2dw59"] Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.610583 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.623388 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2dw59"] Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.628608 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.751161 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.751247 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.760715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-trusted-ca\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc 
kubenswrapper[4772]: I0127 15:12:09.760791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-bound-sa-token\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.760916 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-tls\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.761124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.761227 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-certificates\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.761263 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.761282 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.761314 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6cr\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-kube-api-access-cm6cr\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.786745 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.791663 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.862968 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-certificates\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863464 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6cr\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-kube-api-access-cm6cr\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863573 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-trusted-ca\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-bound-sa-token\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863669 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-tls\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.863701 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.864163 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.864654 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-certificates\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.865318 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-trusted-ca\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc 
kubenswrapper[4772]: I0127 15:12:09.869536 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-registry-tls\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.869534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.882902 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-bound-sa-token\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.883524 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6cr\" (UniqueName: \"kubernetes.io/projected/566c3aef-cf83-4a7b-a77e-774a9dfb90a6-kube-api-access-cm6cr\") pod \"image-registry-66df7c8f76-2dw59\" (UID: \"566c3aef-cf83-4a7b-a77e-774a9dfb90a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:09 crc kubenswrapper[4772]: I0127 15:12:09.944290 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:10 crc kubenswrapper[4772]: I0127 15:12:10.099828 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 15:12:10 crc kubenswrapper[4772]: I0127 15:12:10.099899 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f5shj" Jan 27 15:12:10 crc kubenswrapper[4772]: I0127 15:12:10.433938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-2dw59"] Jan 27 15:12:11 crc kubenswrapper[4772]: I0127 15:12:11.008426 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" event={"ID":"566c3aef-cf83-4a7b-a77e-774a9dfb90a6","Type":"ContainerStarted","Data":"bc3cce03415945747e4cc390ca99cce1ebbbb8af445807ab4be29b39e3748ccb"} Jan 27 15:12:12 crc kubenswrapper[4772]: I0127 15:12:12.016041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" event={"ID":"566c3aef-cf83-4a7b-a77e-774a9dfb90a6","Type":"ContainerStarted","Data":"110f202c0cb81a69a4644066f56a9baff303141c914ee46b5b9318a554daa0e9"} Jan 27 15:12:13 crc kubenswrapper[4772]: I0127 15:12:13.021310 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:13 crc kubenswrapper[4772]: I0127 15:12:13.042145 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" podStartSLOduration=4.042128471 podStartE2EDuration="4.042128471s" podCreationTimestamp="2026-01-27 15:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:12:13.040262566 +0000 UTC 
m=+319.020871684" watchObservedRunningTime="2026-01-27 15:12:13.042128471 +0000 UTC m=+319.022737569" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.069105 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.069342 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" podUID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" containerName="route-controller-manager" containerID="cri-o://d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858" gracePeriod=30 Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.553307 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.669728 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config\") pod \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.669798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-799x8\" (UniqueName: \"kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8\") pod \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.669824 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert\") pod \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " Jan 27 
15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.669888 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca\") pod \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\" (UID: \"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11\") " Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.670557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca" (OuterVolumeSpecName: "client-ca") pod "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" (UID: "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.670580 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config" (OuterVolumeSpecName: "config") pod "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" (UID: "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.674914 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8" (OuterVolumeSpecName: "kube-api-access-799x8") pod "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" (UID: "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11"). InnerVolumeSpecName "kube-api-access-799x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.675048 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" (UID: "ce1b4cd5-019c-41b7-a994-e98eb4fd3b11"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.770943 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.770980 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.770990 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-799x8\" (UniqueName: \"kubernetes.io/projected/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-kube-api-access-799x8\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:15 crc kubenswrapper[4772]: I0127 15:12:15.771001 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.047763 4772 generic.go:334] "Generic (PLEG): container finished" podID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" containerID="d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858" exitCode=0 Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.047821 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" event={"ID":"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11","Type":"ContainerDied","Data":"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858"} Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.047854 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" 
event={"ID":"ce1b4cd5-019c-41b7-a994-e98eb4fd3b11","Type":"ContainerDied","Data":"fe97b2d600f6d6cc4091b116b23735caf871c4e2a0580b348ce94443cc624fcf"} Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.047878 4772 scope.go:117] "RemoveContainer" containerID="d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.048021 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55985dff9-whh87" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.070387 4772 scope.go:117] "RemoveContainer" containerID="d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858" Jan 27 15:12:16 crc kubenswrapper[4772]: E0127 15:12:16.071223 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858\": container with ID starting with d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858 not found: ID does not exist" containerID="d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.071251 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858"} err="failed to get container status \"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858\": rpc error: code = NotFound desc = could not find container \"d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858\": container with ID starting with d424ca2f80c2d25b1c4a792e0cbbcc57ee25bfd64b22c04930189bc1f4e34858 not found: ID does not exist" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.079699 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 
15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.084285 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55985dff9-whh87"] Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.670593 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" path="/var/lib/kubelet/pods/ce1b4cd5-019c-41b7-a994-e98eb4fd3b11/volumes" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.724550 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8"] Jan 27 15:12:16 crc kubenswrapper[4772]: E0127 15:12:16.724753 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" containerName="route-controller-manager" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.724765 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" containerName="route-controller-manager" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.724851 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1b4cd5-019c-41b7-a994-e98eb4fd3b11" containerName="route-controller-manager" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.725220 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.728222 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.728872 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.729013 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.728907 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.729451 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.729329 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.736740 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8"] Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.883812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ebccc5-7e14-42c0-8324-9acee056f8a7-serving-cert\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.883875 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhdcw\" (UniqueName: \"kubernetes.io/projected/f7ebccc5-7e14-42c0-8324-9acee056f8a7-kube-api-access-bhdcw\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.883918 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-config\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.883939 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-client-ca\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.985323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ebccc5-7e14-42c0-8324-9acee056f8a7-serving-cert\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.985373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhdcw\" (UniqueName: \"kubernetes.io/projected/f7ebccc5-7e14-42c0-8324-9acee056f8a7-kube-api-access-bhdcw\") pod 
\"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.985413 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-config\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.985434 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-client-ca\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.986328 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-client-ca\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.986931 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7ebccc5-7e14-42c0-8324-9acee056f8a7-config\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:16 crc kubenswrapper[4772]: I0127 15:12:16.989583 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7ebccc5-7e14-42c0-8324-9acee056f8a7-serving-cert\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:17 crc kubenswrapper[4772]: I0127 15:12:17.007796 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhdcw\" (UniqueName: \"kubernetes.io/projected/f7ebccc5-7e14-42c0-8324-9acee056f8a7-kube-api-access-bhdcw\") pod \"route-controller-manager-66f698ffcb-hdfc8\" (UID: \"f7ebccc5-7e14-42c0-8324-9acee056f8a7\") " pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:17 crc kubenswrapper[4772]: I0127 15:12:17.068886 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:17 crc kubenswrapper[4772]: I0127 15:12:17.456711 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8"] Jan 27 15:12:17 crc kubenswrapper[4772]: W0127 15:12:17.461786 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ebccc5_7e14_42c0_8324_9acee056f8a7.slice/crio-1127780f119b177be5380b18f8676f014ff7a27500f8d274643e4193e7cfa2f7 WatchSource:0}: Error finding container 1127780f119b177be5380b18f8676f014ff7a27500f8d274643e4193e7cfa2f7: Status 404 returned error can't find the container with id 1127780f119b177be5380b18f8676f014ff7a27500f8d274643e4193e7cfa2f7 Jan 27 15:12:18 crc kubenswrapper[4772]: I0127 15:12:18.059580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" 
event={"ID":"f7ebccc5-7e14-42c0-8324-9acee056f8a7","Type":"ContainerStarted","Data":"033b0040a797aa93a999c71c60e265daf0395c42e378608e62e4905533abc35e"} Jan 27 15:12:18 crc kubenswrapper[4772]: I0127 15:12:18.059991 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:18 crc kubenswrapper[4772]: I0127 15:12:18.060035 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" event={"ID":"f7ebccc5-7e14-42c0-8324-9acee056f8a7","Type":"ContainerStarted","Data":"1127780f119b177be5380b18f8676f014ff7a27500f8d274643e4193e7cfa2f7"} Jan 27 15:12:18 crc kubenswrapper[4772]: I0127 15:12:18.081447 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" podStartSLOduration=3.0814304359999998 podStartE2EDuration="3.081430436s" podCreationTimestamp="2026-01-27 15:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:12:18.080353284 +0000 UTC m=+324.060962412" watchObservedRunningTime="2026-01-27 15:12:18.081430436 +0000 UTC m=+324.062039534" Jan 27 15:12:18 crc kubenswrapper[4772]: I0127 15:12:18.159480 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66f698ffcb-hdfc8" Jan 27 15:12:29 crc kubenswrapper[4772]: I0127 15:12:29.949072 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-2dw59" Jan 27 15:12:30 crc kubenswrapper[4772]: I0127 15:12:30.001035 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.049296 
4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" podUID="877de785-bc18-4c1c-970a-1e6533539467" containerName="registry" containerID="cri-o://228e6fd0668bf433c1f6aa09021f79564dfe5e7bb750301de0ab0cbfce9f1ef2" gracePeriod=30 Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.340084 4772 generic.go:334] "Generic (PLEG): container finished" podID="877de785-bc18-4c1c-970a-1e6533539467" containerID="228e6fd0668bf433c1f6aa09021f79564dfe5e7bb750301de0ab0cbfce9f1ef2" exitCode=0 Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.340193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" event={"ID":"877de785-bc18-4c1c-970a-1e6533539467","Type":"ContainerDied","Data":"228e6fd0668bf433c1f6aa09021f79564dfe5e7bb750301de0ab0cbfce9f1ef2"} Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.402088 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567125 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567185 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h66t\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567206 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567246 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567322 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567338 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567539 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.567578 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets\") pod \"877de785-bc18-4c1c-970a-1e6533539467\" (UID: \"877de785-bc18-4c1c-970a-1e6533539467\") " Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.568322 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.568536 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.572774 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.572822 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.572970 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.576460 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.577387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t" (OuterVolumeSpecName: "kube-api-access-5h66t") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "kube-api-access-5h66t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.582485 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "877de785-bc18-4c1c-970a-1e6533539467" (UID: "877de785-bc18-4c1c-970a-1e6533539467"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669672 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/877de785-bc18-4c1c-970a-1e6533539467-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669726 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669751 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h66t\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-kube-api-access-5h66t\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669769 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669786 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/877de785-bc18-4c1c-970a-1e6533539467-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669803 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/877de785-bc18-4c1c-970a-1e6533539467-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:55 crc kubenswrapper[4772]: I0127 15:12:55.669820 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/877de785-bc18-4c1c-970a-1e6533539467-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.347313 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" event={"ID":"877de785-bc18-4c1c-970a-1e6533539467","Type":"ContainerDied","Data":"a3585a039b9cbf60a67ac7ced2eaf947fce2a88abe7705503eb446ef5ad9fc74"} Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.347362 4772 scope.go:117] "RemoveContainer" containerID="228e6fd0668bf433c1f6aa09021f79564dfe5e7bb750301de0ab0cbfce9f1ef2" Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.347398 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-crlcr" Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.373564 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.378382 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-crlcr"] Jan 27 15:12:56 crc kubenswrapper[4772]: I0127 15:12:56.673033 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877de785-bc18-4c1c-970a-1e6533539467" path="/var/lib/kubelet/pods/877de785-bc18-4c1c-970a-1e6533539467/volumes" Jan 27 15:13:12 crc kubenswrapper[4772]: I0127 15:13:12.058747 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:13:12 crc kubenswrapper[4772]: I0127 15:13:12.059107 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:13:42 crc kubenswrapper[4772]: I0127 15:13:42.058850 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:13:42 crc kubenswrapper[4772]: I0127 15:13:42.059624 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.058216 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.058944 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.058993 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.059798 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.059886 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10" gracePeriod=600 Jan 27 
15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.757429 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10" exitCode=0 Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.757481 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10"} Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.757886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462"} Jan 27 15:14:12 crc kubenswrapper[4772]: I0127 15:14:12.757935 4772 scope.go:117] "RemoveContainer" containerID="0d95f231ee1013dc5475acac704b796538ef0050cd94e435a3382bd12b7cbf19" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.158094 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g"] Jan 27 15:15:00 crc kubenswrapper[4772]: E0127 15:15:00.158822 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877de785-bc18-4c1c-970a-1e6533539467" containerName="registry" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.158835 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="877de785-bc18-4c1c-970a-1e6533539467" containerName="registry" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.158929 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="877de785-bc18-4c1c-970a-1e6533539467" containerName="registry" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.159312 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.161382 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.163274 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.174078 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g"] Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.326589 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.326661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r96xf\" (UniqueName: \"kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.326703 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.428224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.428301 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.428353 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r96xf\" (UniqueName: \"kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.429433 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.434676 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.449154 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r96xf\" (UniqueName: \"kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf\") pod \"collect-profiles-29492115-hb89g\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.480968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:00 crc kubenswrapper[4772]: I0127 15:15:00.658689 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g"] Jan 27 15:15:01 crc kubenswrapper[4772]: I0127 15:15:01.037121 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" event={"ID":"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1","Type":"ContainerStarted","Data":"8931f0cc38dd8c453e687a0b65ac6a9c2d9a0265440b30f5480c6ac0483f9860"} Jan 27 15:15:01 crc kubenswrapper[4772]: I0127 15:15:01.037471 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" event={"ID":"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1","Type":"ContainerStarted","Data":"d2a67b192739d3b926b87f8d49b2048266ed3826d7d2d9ed2f68c533ca1f4d5b"} Jan 27 15:15:02 crc kubenswrapper[4772]: I0127 15:15:02.044566 4772 generic.go:334] "Generic (PLEG): container finished" podID="616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" 
containerID="8931f0cc38dd8c453e687a0b65ac6a9c2d9a0265440b30f5480c6ac0483f9860" exitCode=0 Jan 27 15:15:02 crc kubenswrapper[4772]: I0127 15:15:02.044609 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" event={"ID":"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1","Type":"ContainerDied","Data":"8931f0cc38dd8c453e687a0b65ac6a9c2d9a0265440b30f5480c6ac0483f9860"} Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.261037 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.363517 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume\") pod \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.363598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume\") pod \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.363719 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r96xf\" (UniqueName: \"kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf\") pod \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\" (UID: \"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1\") " Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.364228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" (UID: "616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.368978 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" (UID: "616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.368976 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf" (OuterVolumeSpecName: "kube-api-access-r96xf") pod "616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" (UID: "616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1"). InnerVolumeSpecName "kube-api-access-r96xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.465476 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.465517 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:15:03 crc kubenswrapper[4772]: I0127 15:15:03.465529 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r96xf\" (UniqueName: \"kubernetes.io/projected/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1-kube-api-access-r96xf\") on node \"crc\" DevicePath \"\"" Jan 27 15:15:04 crc kubenswrapper[4772]: I0127 15:15:04.059783 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" event={"ID":"616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1","Type":"ContainerDied","Data":"d2a67b192739d3b926b87f8d49b2048266ed3826d7d2d9ed2f68c533ca1f4d5b"} Jan 27 15:15:04 crc kubenswrapper[4772]: I0127 15:15:04.059835 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2a67b192739d3b926b87f8d49b2048266ed3826d7d2d9ed2f68c533ca1f4d5b" Jan 27 15:15:04 crc kubenswrapper[4772]: I0127 15:15:04.059902 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g" Jan 27 15:15:54 crc kubenswrapper[4772]: I0127 15:15:54.857366 4772 scope.go:117] "RemoveContainer" containerID="437c578755bfcacf0145c1b3dcede3b1938b4e11e6ad9c7db9d8ac6a8b6df37e" Jan 27 15:16:12 crc kubenswrapper[4772]: I0127 15:16:12.058880 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:16:12 crc kubenswrapper[4772]: I0127 15:16:12.059904 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:16:42 crc kubenswrapper[4772]: I0127 15:16:42.059212 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:16:42 crc kubenswrapper[4772]: I0127 15:16:42.060349 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:17:12 crc kubenswrapper[4772]: I0127 15:17:12.058873 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:17:12 crc kubenswrapper[4772]: I0127 15:17:12.059454 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:17:12 crc kubenswrapper[4772]: I0127 15:17:12.059500 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:17:12 crc kubenswrapper[4772]: I0127 15:17:12.060069 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:17:12 crc kubenswrapper[4772]: I0127 15:17:12.060120 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462" gracePeriod=600 Jan 27 15:17:13 crc kubenswrapper[4772]: I0127 15:17:13.122923 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462" exitCode=0 Jan 27 15:17:13 crc kubenswrapper[4772]: I0127 15:17:13.122991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462"} Jan 27 15:17:13 crc kubenswrapper[4772]: I0127 15:17:13.123291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3"} Jan 27 15:17:13 crc kubenswrapper[4772]: I0127 15:17:13.123326 4772 scope.go:117] "RemoveContainer" containerID="8e72007caa5160368d39dc40b9c7f95a9beba3bef9f9e290eac1d112ef6eeb10" Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.865063 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n2khk"] Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867021 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-controller" containerID="cri-o://e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867108 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="nbdb" containerID="cri-o://0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867404 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="sbdb" containerID="cri-o://0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867410 4772 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="northd" containerID="cri-o://5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867504 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.867475 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-node" containerID="cri-o://321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.868362 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-acl-logging" containerID="cri-o://3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" gracePeriod=30 Jan 27 15:18:48 crc kubenswrapper[4772]: I0127 15:18:48.914255 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" containerID="cri-o://7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" gracePeriod=30 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.228342 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/3.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.232084 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovn-acl-logging/0.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.232747 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovn-controller/0.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.233240 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285344 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qml47"] Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285661 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kubecfg-setup" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285690 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kubecfg-setup" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285704 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285710 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285720 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="nbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.285728 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="nbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285738 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285743 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285754 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285761 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285773 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="sbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285783 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="sbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285793 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-node" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285800 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-node" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285807 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.285812 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285819 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285826 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285834 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="northd" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285841 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="northd" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285855 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285862 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285871 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" containerName="collect-profiles" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285878 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" containerName="collect-profiles" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.285888 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-acl-logging" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.285896 
4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-acl-logging" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286015 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286030 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286039 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286048 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-acl-logging" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286057 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" containerName="collect-profiles" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286065 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="kube-rbac-proxy-node" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286074 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="northd" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286081 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="sbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286091 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="nbdb" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.286099 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286106 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovn-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.286307 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286320 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286422 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.286433 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerName="ovnkube-controller" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.289430 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339515 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339741 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339823 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339944 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340031 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340107 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340211 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340314 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dt6g\" (UniqueName: \"kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340408 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339642 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.339786 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340108 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket" (OuterVolumeSpecName: "log-socket") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340134 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash" (OuterVolumeSpecName: "host-slash") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340310 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340434 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340471 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340484 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340561 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340579 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340619 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340636 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340655 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340674 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340690 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340720 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340735 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes\") pod \"736264c8-cd18-479a-88ba-e1ec15dbfdae\" (UID: \"736264c8-cd18-479a-88ba-e1ec15dbfdae\") " Jan 27 15:18:49 
crc kubenswrapper[4772]: I0127 15:18:49.340837 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff07e8b2-325b-4f96-b685-f2068052a960-ovn-node-metrics-cert\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340878 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-config\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340895 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-netns\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340911 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-ovn\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" 
Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340950 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-var-lib-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340970 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-env-overrides\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.340995 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7p2\" (UniqueName: \"kubernetes.io/projected/ff07e8b2-325b-4f96-b685-f2068052a960-kube-api-access-8b7p2\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341021 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-log-socket\") pod \"ovnkube-node-qml47\" (UID: 
\"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341047 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-slash\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341071 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-bin\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-kubelet\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341122 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-systemd-units\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341145 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-netd\") pod \"ovnkube-node-qml47\" (UID: 
\"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341193 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-node-log\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-etc-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341236 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-script-lib\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341258 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-systemd\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341275 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341305 4772 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341314 4772 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341325 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341334 4772 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-log-socket\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341342 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341349 4772 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-slash\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341357 4772 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-netd\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341365 4772 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341419 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341651 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log" (OuterVolumeSpecName: "node-log") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341711 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341741 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.341793 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.342206 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.342291 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.342301 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.348445 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.348561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g" (OuterVolumeSpecName: "kube-api-access-2dt6g") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "kube-api-access-2dt6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.359339 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "736264c8-cd18-479a-88ba-e1ec15dbfdae" (UID: "736264c8-cd18-479a-88ba-e1ec15dbfdae"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-bin\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442744 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-kubelet\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442771 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-systemd-units\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442627 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-bin\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442822 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-netd\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.442788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-cni-netd\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-systemd-units\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442901 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-kubelet\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442922 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-node-log\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.442990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-node-log\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443041 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-etc-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443098 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-script-lib\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443111 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-etc-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-systemd\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443260 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-systemd\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-ovn-kubernetes\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443326 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff07e8b2-325b-4f96-b685-f2068052a960-ovn-node-metrics-cert\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443376 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-ovn-kubernetes\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443437 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443461 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-config\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443485 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-netns\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-ovn\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-var-lib-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443595 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-env-overrides\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443623 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7p2\" (UniqueName: \"kubernetes.io/projected/ff07e8b2-325b-4f96-b685-f2068052a960-kube-api-access-8b7p2\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-run-netns\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-log-socket\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-slash\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443742 4772 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-kubelet\") on node \"crc\" DevicePath \"\"" 
Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443772 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443785 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443799 4772 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443814 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443827 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dt6g\" (UniqueName: \"kubernetes.io/projected/736264c8-cd18-479a-88ba-e1ec15dbfdae-kube-api-access-2dt6g\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443837 4772 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443849 4772 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443861 4772 
reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443872 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/736264c8-cd18-479a-88ba-e1ec15dbfdae-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443882 4772 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-node-log\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443893 4772 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/736264c8-cd18-479a-88ba-e1ec15dbfdae-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443926 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-slash\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443958 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-run-ovn\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.443987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.444016 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-var-lib-openvswitch\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.444886 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff07e8b2-325b-4f96-b685-f2068052a960-log-socket\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.444908 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-script-lib\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.445246 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-env-overrides\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.445422 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff07e8b2-325b-4f96-b685-f2068052a960-ovnkube-config\") pod 
\"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.448340 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff07e8b2-325b-4f96-b685-f2068052a960-ovn-node-metrics-cert\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.461644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7p2\" (UniqueName: \"kubernetes.io/projected/ff07e8b2-325b-4f96-b685-f2068052a960-kube-api-access-8b7p2\") pod \"ovnkube-node-qml47\" (UID: \"ff07e8b2-325b-4f96-b685-f2068052a960\") " pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.605891 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.707541 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovnkube-controller/3.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.711459 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovn-acl-logging/0.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.711978 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n2khk_736264c8-cd18-479a-88ba-e1ec15dbfdae/ovn-controller/0.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712389 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712436 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712455 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712469 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712486 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" 
containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712501 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" exitCode=0 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712520 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" exitCode=143 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712535 4772 generic.go:334] "Generic (PLEG): container finished" podID="736264c8-cd18-479a-88ba-e1ec15dbfdae" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" exitCode=143 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712556 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712621 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712708 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" 
event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712769 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712812 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712831 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712843 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712856 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712868 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712882 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712895 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712907 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712920 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712937 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712954 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712970 4772 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712982 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.712995 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713007 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713018 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713031 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713042 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713054 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713066 4772 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713082 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713099 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713112 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713123 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713135 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713146 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713157 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 
15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713204 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713217 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713228 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713239 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713258 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n2khk" event={"ID":"736264c8-cd18-479a-88ba-e1ec15dbfdae","Type":"ContainerDied","Data":"b6a209d8fc4e180971a6f92a0f3c7493472a2095b6c5303a9b0ce0f4e62056a9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713276 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713289 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713301 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713313 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713324 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713335 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713347 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713358 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713370 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713382 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.713407 4772 scope.go:117] "RemoveContainer" 
containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.715709 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/2.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.724492 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/1.log" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.724625 4772 generic.go:334] "Generic (PLEG): container finished" podID="87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8" containerID="a5fee45d3fc79618abfe1fb780f6741fbf20558f07d7edf5c931f442a9c1c7dd" exitCode=2 Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.724849 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerDied","Data":"a5fee45d3fc79618abfe1fb780f6741fbf20558f07d7edf5c931f442a9c1c7dd"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.724911 4772 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.725495 4772 scope.go:117] "RemoveContainer" containerID="a5fee45d3fc79618abfe1fb780f6741fbf20558f07d7edf5c931f442a9c1c7dd" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.725772 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-x7jwx_openshift-multus(87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8)\"" pod="openshift-multus/multus-x7jwx" podUID="87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.731195 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"ac4a5d287e4ae272f3c908bc4bec13137183b7668478f6a7b3fd983a452c66ad"} Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.751646 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.773834 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n2khk"] Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.778688 4772 scope.go:117] "RemoveContainer" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.786988 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n2khk"] Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.849039 4772 scope.go:117] "RemoveContainer" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.860408 4772 scope.go:117] "RemoveContainer" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.874055 4772 scope.go:117] "RemoveContainer" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.885303 4772 scope.go:117] "RemoveContainer" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.896699 4772 scope.go:117] "RemoveContainer" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.911506 4772 scope.go:117] "RemoveContainer" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 
27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.926553 4772 scope.go:117] "RemoveContainer" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.937900 4772 scope.go:117] "RemoveContainer" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.938371 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.938415 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} err="failed to get container status \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.938434 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.938686 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": container with ID starting with 8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a not found: ID does not exist" 
containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.938720 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} err="failed to get container status \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": rpc error: code = NotFound desc = could not find container \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": container with ID starting with 8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.938743 4772 scope.go:117] "RemoveContainer" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.939083 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": container with ID starting with 0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae not found: ID does not exist" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.939134 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} err="failed to get container status \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": rpc error: code = NotFound desc = could not find container \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": container with ID starting with 0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.939154 4772 scope.go:117] 
"RemoveContainer" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.939502 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": container with ID starting with 0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b not found: ID does not exist" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.939528 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} err="failed to get container status \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": rpc error: code = NotFound desc = could not find container \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": container with ID starting with 0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.939544 4772 scope.go:117] "RemoveContainer" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.940155 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": container with ID starting with 5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4 not found: ID does not exist" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.940211 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} err="failed to get container status \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": rpc error: code = NotFound desc = could not find container \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": container with ID starting with 5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.940230 4772 scope.go:117] "RemoveContainer" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.941419 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": container with ID starting with 45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5 not found: ID does not exist" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.941439 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} err="failed to get container status \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": rpc error: code = NotFound desc = could not find container \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": container with ID starting with 45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.941452 4772 scope.go:117] "RemoveContainer" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.941703 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": container with ID starting with 321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9 not found: ID does not exist" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.941728 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} err="failed to get container status \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": rpc error: code = NotFound desc = could not find container \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": container with ID starting with 321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.941742 4772 scope.go:117] "RemoveContainer" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.942109 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": container with ID starting with 3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854 not found: ID does not exist" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942133 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} err="failed to get container status \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": rpc error: code = NotFound desc = could not find container 
\"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": container with ID starting with 3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942149 4772 scope.go:117] "RemoveContainer" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.942662 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": container with ID starting with e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e not found: ID does not exist" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942682 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} err="failed to get container status \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": rpc error: code = NotFound desc = could not find container \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": container with ID starting with e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942695 4772 scope.go:117] "RemoveContainer" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: E0127 15:18:49.942888 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": container with ID starting with c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d not found: ID does not exist" 
containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942910 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} err="failed to get container status \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": rpc error: code = NotFound desc = could not find container \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": container with ID starting with c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.942923 4772 scope.go:117] "RemoveContainer" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943102 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} err="failed to get container status \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943121 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943335 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} err="failed to get container status \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": rpc error: code = NotFound desc = could 
not find container \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": container with ID starting with 8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943355 4772 scope.go:117] "RemoveContainer" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943518 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} err="failed to get container status \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": rpc error: code = NotFound desc = could not find container \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": container with ID starting with 0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943540 4772 scope.go:117] "RemoveContainer" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943701 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} err="failed to get container status \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": rpc error: code = NotFound desc = could not find container \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": container with ID starting with 0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943716 4772 scope.go:117] "RemoveContainer" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.943849 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} err="failed to get container status \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": rpc error: code = NotFound desc = could not find container \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": container with ID starting with 5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.943863 4772 scope.go:117] "RemoveContainer" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944024 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} err="failed to get container status \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": rpc error: code = NotFound desc = could not find container \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": container with ID starting with 45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944046 4772 scope.go:117] "RemoveContainer" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944299 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} err="failed to get container status \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": rpc error: code = NotFound desc = could not find container \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": container with ID starting with 
321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944317 4772 scope.go:117] "RemoveContainer" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944482 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} err="failed to get container status \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": rpc error: code = NotFound desc = could not find container \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": container with ID starting with 3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944505 4772 scope.go:117] "RemoveContainer" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944679 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} err="failed to get container status \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": rpc error: code = NotFound desc = could not find container \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": container with ID starting with e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944702 4772 scope.go:117] "RemoveContainer" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944902 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} err="failed to get container status \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": rpc error: code = NotFound desc = could not find container \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": container with ID starting with c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.944933 4772 scope.go:117] "RemoveContainer" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945121 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} err="failed to get container status \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945138 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945376 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} err="failed to get container status \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": rpc error: code = NotFound desc = could not find container \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": container with ID starting with 8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a not found: ID does not 
exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945395 4772 scope.go:117] "RemoveContainer" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945684 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} err="failed to get container status \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": rpc error: code = NotFound desc = could not find container \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": container with ID starting with 0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945706 4772 scope.go:117] "RemoveContainer" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945930 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} err="failed to get container status \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": rpc error: code = NotFound desc = could not find container \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": container with ID starting with 0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.945954 4772 scope.go:117] "RemoveContainer" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.946330 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} err="failed to get container status 
\"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": rpc error: code = NotFound desc = could not find container \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": container with ID starting with 5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.946379 4772 scope.go:117] "RemoveContainer" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947089 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} err="failed to get container status \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": rpc error: code = NotFound desc = could not find container \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": container with ID starting with 45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947110 4772 scope.go:117] "RemoveContainer" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947437 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} err="failed to get container status \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": rpc error: code = NotFound desc = could not find container \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": container with ID starting with 321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947483 4772 scope.go:117] "RemoveContainer" 
containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947725 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} err="failed to get container status \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": rpc error: code = NotFound desc = could not find container \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": container with ID starting with 3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.947746 4772 scope.go:117] "RemoveContainer" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948073 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} err="failed to get container status \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": rpc error: code = NotFound desc = could not find container \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": container with ID starting with e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948096 4772 scope.go:117] "RemoveContainer" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948411 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} err="failed to get container status \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": rpc error: code = NotFound desc = could 
not find container \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": container with ID starting with c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948432 4772 scope.go:117] "RemoveContainer" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948602 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} err="failed to get container status \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948621 4772 scope.go:117] "RemoveContainer" containerID="8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948796 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a"} err="failed to get container status \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": rpc error: code = NotFound desc = could not find container \"8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a\": container with ID starting with 8724eea2ef6df0dd65b48200e79bf0dd04d8e8a658ef59e0d5dbbc706d4cf68a not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.948832 4772 scope.go:117] "RemoveContainer" containerID="0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 
15:18:49.949027 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae"} err="failed to get container status \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": rpc error: code = NotFound desc = could not find container \"0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae\": container with ID starting with 0b8a45e6cc2ed92af13f48ae2d40ef8ba713fda7f78a0f0df8728375aa1326ae not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949051 4772 scope.go:117] "RemoveContainer" containerID="0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949230 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b"} err="failed to get container status \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": rpc error: code = NotFound desc = could not find container \"0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b\": container with ID starting with 0672785b54a52ada7c5ecb3813df46790ca221c9234910436fb48c6c16bdbb3b not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949253 4772 scope.go:117] "RemoveContainer" containerID="5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949436 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4"} err="failed to get container status \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": rpc error: code = NotFound desc = could not find container \"5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4\": container with ID starting with 
5589864ab0df6b8ca9f810dea1168b5b16ac2b158531aac819758f0281dadba4 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949455 4772 scope.go:117] "RemoveContainer" containerID="45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949635 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5"} err="failed to get container status \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": rpc error: code = NotFound desc = could not find container \"45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5\": container with ID starting with 45a46f78c67486ce0034eefead74ba09c52e5ced21e7914963a219fef85efcc5 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949655 4772 scope.go:117] "RemoveContainer" containerID="321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949864 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9"} err="failed to get container status \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": rpc error: code = NotFound desc = could not find container \"321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9\": container with ID starting with 321c3da626a22e29558283e4efb292c6e039764e36318c3686f783e48ad876b9 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.949884 4772 scope.go:117] "RemoveContainer" containerID="3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950376 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854"} err="failed to get container status \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": rpc error: code = NotFound desc = could not find container \"3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854\": container with ID starting with 3de2db48c4b82a23cce25c1be5a9c9e66439c283f7651d322e94015b94dc7854 not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950415 4772 scope.go:117] "RemoveContainer" containerID="e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950666 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e"} err="failed to get container status \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": rpc error: code = NotFound desc = could not find container \"e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e\": container with ID starting with e68c9acebcb335ee630582ff98a6406849766800d848c6f4a7d87e22a65e1e1e not found: ID does not exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950684 4772 scope.go:117] "RemoveContainer" containerID="c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950891 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d"} err="failed to get container status \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": rpc error: code = NotFound desc = could not find container \"c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d\": container with ID starting with c55ceb84c37125c4dc988d6e8c3ea65d854c89190c2ac8f32c191d6d55e2982d not found: ID does not 
exist" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.950928 4772 scope.go:117] "RemoveContainer" containerID="7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823" Jan 27 15:18:49 crc kubenswrapper[4772]: I0127 15:18:49.951192 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823"} err="failed to get container status \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": rpc error: code = NotFound desc = could not find container \"7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823\": container with ID starting with 7bcf11983997321ee81682fb1ab65d69810342d1d15ef9f8da9f8d1344cdc823 not found: ID does not exist" Jan 27 15:18:50 crc kubenswrapper[4772]: I0127 15:18:50.670672 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736264c8-cd18-479a-88ba-e1ec15dbfdae" path="/var/lib/kubelet/pods/736264c8-cd18-479a-88ba-e1ec15dbfdae/volumes" Jan 27 15:18:50 crc kubenswrapper[4772]: I0127 15:18:50.739765 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff07e8b2-325b-4f96-b685-f2068052a960" containerID="93cf0cb8c7e8fc64f2417d90ba2ce26fdbca27a4bbf34b0ab6539feb8b83bf96" exitCode=0 Jan 27 15:18:50 crc kubenswrapper[4772]: I0127 15:18:50.739845 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerDied","Data":"93cf0cb8c7e8fc64f2417d90ba2ce26fdbca27a4bbf34b0ab6539feb8b83bf96"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.752690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"f75eb4ed67c4fe2b28f19628b2f7e25812cf89c51eb13a3fc08dbf049eddcaa5"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.753140 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"aaca2c527bc95b49c9ba9bae28800d2aae6784e5558342ef024c11be4d341405"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.753155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"537c0ae0bfa35738f9cff8a98c8948a3a4160a7c65ec7f2a8b377177e94338d4"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.753186 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"500953b46c83a06e86a04b002f891b91017d531482a6d4c341f1b652687440e7"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.753199 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"d9c68bc8eb48037e17d05979d4605cba35a4f6b495e3950a2b99610cd7b14c31"} Jan 27 15:18:51 crc kubenswrapper[4772]: I0127 15:18:51.753210 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"58042ebd592c2e1a476cd7c933626dc515b1acd1a401b13d469c688dacd59b3f"} Jan 27 15:18:53 crc kubenswrapper[4772]: I0127 15:18:53.774620 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"2562ba36a42df9c7d6943315e35e2a1bf5af741384e77eb33177ba315e7fbc56"} Jan 27 15:18:54 crc kubenswrapper[4772]: I0127 15:18:54.917809 4772 scope.go:117] "RemoveContainer" 
containerID="9f72b451fa77f3fce2c251de546110ab49c7c9e0122759f6ef29a32fde422356" Jan 27 15:18:55 crc kubenswrapper[4772]: I0127 15:18:55.787858 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/2.log" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.801624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" event={"ID":"ff07e8b2-325b-4f96-b685-f2068052a960","Type":"ContainerStarted","Data":"22cf781b1425a3e566222d387ab7f24fc9a07b9dde7c46d5838e46a939cb4c7a"} Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.801960 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.801974 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.801984 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.836474 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.836922 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.840968 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" podStartSLOduration=7.840952459 podStartE2EDuration="7.840952459s" podCreationTimestamp="2026-01-27 15:18:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:18:56.839968571 +0000 
UTC m=+722.820577699" watchObservedRunningTime="2026-01-27 15:18:56.840952459 +0000 UTC m=+722.821561557" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.974012 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-pmrs5"] Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.974727 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.976627 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.976757 4772 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-r9nmx" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.977084 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 27 15:18:56 crc kubenswrapper[4772]: I0127 15:18:56.977373 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.018594 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pmrs5"] Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.040743 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79mhd\" (UniqueName: \"kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.040813 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage\") pod \"crc-storage-crc-pmrs5\" (UID: 
\"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.040840 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.141458 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79mhd\" (UniqueName: \"kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.141526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.141553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.141867 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 
15:18:57.143284 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.161007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79mhd\" (UniqueName: \"kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd\") pod \"crc-storage-crc-pmrs5\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.289364 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.313534 4772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(e72254d7c4699290e0bb88e53cbb2017bdf725db6e241c5fea9269ea5eac8473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.313724 4772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(e72254d7c4699290e0bb88e53cbb2017bdf725db6e241c5fea9269ea5eac8473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.313798 4772 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(e72254d7c4699290e0bb88e53cbb2017bdf725db6e241c5fea9269ea5eac8473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.313895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-pmrs5_crc-storage(32a5cafc-0519-4e90-9456-acb182176c41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-pmrs5_crc-storage(32a5cafc-0519-4e90-9456-acb182176c41)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(e72254d7c4699290e0bb88e53cbb2017bdf725db6e241c5fea9269ea5eac8473): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-pmrs5" podUID="32a5cafc-0519-4e90-9456-acb182176c41" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.807082 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: I0127 15:18:57.807582 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.835432 4772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(0b81e4bb3394a9db9e768032e5b4b727da59aed8fa29834672c7ceadbc239a0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.835503 4772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(0b81e4bb3394a9db9e768032e5b4b727da59aed8fa29834672c7ceadbc239a0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.835528 4772 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(0b81e4bb3394a9db9e768032e5b4b727da59aed8fa29834672c7ceadbc239a0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:18:57 crc kubenswrapper[4772]: E0127 15:18:57.835580 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-pmrs5_crc-storage(32a5cafc-0519-4e90-9456-acb182176c41)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-pmrs5_crc-storage(32a5cafc-0519-4e90-9456-acb182176c41)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-pmrs5_crc-storage_32a5cafc-0519-4e90-9456-acb182176c41_0(0b81e4bb3394a9db9e768032e5b4b727da59aed8fa29834672c7ceadbc239a0e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-pmrs5" podUID="32a5cafc-0519-4e90-9456-acb182176c41" Jan 27 15:19:03 crc kubenswrapper[4772]: I0127 15:19:03.663886 4772 scope.go:117] "RemoveContainer" containerID="a5fee45d3fc79618abfe1fb780f6741fbf20558f07d7edf5c931f442a9c1c7dd" Jan 27 15:19:03 crc kubenswrapper[4772]: I0127 15:19:03.847302 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/2.log" Jan 27 15:19:04 crc kubenswrapper[4772]: I0127 15:19:04.856996 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x7jwx_87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8/kube-multus/2.log" Jan 27 15:19:04 crc kubenswrapper[4772]: I0127 15:19:04.858038 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x7jwx" event={"ID":"87cb2a5b-099e-4a3b-a0bc-cba76a1a00a8","Type":"ContainerStarted","Data":"2f8786cb2f7e8a784f7db02cc01cff531e7a0af9267bee88f64e672f700cc88f"} Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.058266 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.058817 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.663644 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.664454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.865480 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-pmrs5"] Jan 27 15:19:12 crc kubenswrapper[4772]: W0127 15:19:12.870199 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a5cafc_0519_4e90_9456_acb182176c41.slice/crio-cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc WatchSource:0}: Error finding container cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc: Status 404 returned error can't find the container with id cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.872748 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:19:12 crc kubenswrapper[4772]: I0127 15:19:12.914226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pmrs5" 
event={"ID":"32a5cafc-0519-4e90-9456-acb182176c41","Type":"ContainerStarted","Data":"cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc"} Jan 27 15:19:14 crc kubenswrapper[4772]: I0127 15:19:14.925951 4772 generic.go:334] "Generic (PLEG): container finished" podID="32a5cafc-0519-4e90-9456-acb182176c41" containerID="f4e8f8b6e9c9139e4588eb373fc8616c60521a1a5d0cbfb79f4f8c9d4dc676b9" exitCode=0 Jan 27 15:19:14 crc kubenswrapper[4772]: I0127 15:19:14.926049 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pmrs5" event={"ID":"32a5cafc-0519-4e90-9456-acb182176c41","Type":"ContainerDied","Data":"f4e8f8b6e9c9139e4588eb373fc8616c60521a1a5d0cbfb79f4f8c9d4dc676b9"} Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.207656 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.327047 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79mhd\" (UniqueName: \"kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd\") pod \"32a5cafc-0519-4e90-9456-acb182176c41\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.327212 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt\") pod \"32a5cafc-0519-4e90-9456-acb182176c41\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.327284 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "32a5cafc-0519-4e90-9456-acb182176c41" (UID: "32a5cafc-0519-4e90-9456-acb182176c41"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.327963 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage\") pod \"32a5cafc-0519-4e90-9456-acb182176c41\" (UID: \"32a5cafc-0519-4e90-9456-acb182176c41\") " Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.328189 4772 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/32a5cafc-0519-4e90-9456-acb182176c41-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.333727 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd" (OuterVolumeSpecName: "kube-api-access-79mhd") pod "32a5cafc-0519-4e90-9456-acb182176c41" (UID: "32a5cafc-0519-4e90-9456-acb182176c41"). InnerVolumeSpecName "kube-api-access-79mhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.347332 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "32a5cafc-0519-4e90-9456-acb182176c41" (UID: "32a5cafc-0519-4e90-9456-acb182176c41"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.429088 4772 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/32a5cafc-0519-4e90-9456-acb182176c41-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.429134 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79mhd\" (UniqueName: \"kubernetes.io/projected/32a5cafc-0519-4e90-9456-acb182176c41-kube-api-access-79mhd\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.945825 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-pmrs5" event={"ID":"32a5cafc-0519-4e90-9456-acb182176c41","Type":"ContainerDied","Data":"cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc"} Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.945879 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc832d30aabf335cefcf2aff9b9f4129cc8d1a2b7ef5bc41178d9a657d238dcc" Jan 27 15:19:16 crc kubenswrapper[4772]: I0127 15:19:16.945947 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-pmrs5" Jan 27 15:19:19 crc kubenswrapper[4772]: I0127 15:19:19.634584 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qml47" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.124718 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf"] Jan 27 15:19:23 crc kubenswrapper[4772]: E0127 15:19:23.125256 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a5cafc-0519-4e90-9456-acb182176c41" containerName="storage" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.125269 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a5cafc-0519-4e90-9456-acb182176c41" containerName="storage" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.125373 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a5cafc-0519-4e90-9456-acb182176c41" containerName="storage" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.126021 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.128321 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.133456 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf"] Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.322750 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.322843 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwcjv\" (UniqueName: \"kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.322968 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: 
I0127 15:19:23.423983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.424070 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcjv\" (UniqueName: \"kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.424601 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.425300 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.425591 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.444467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcjv\" (UniqueName: \"kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.743628 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:23 crc kubenswrapper[4772]: I0127 15:19:23.971346 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf"] Jan 27 15:19:24 crc kubenswrapper[4772]: I0127 15:19:24.994919 4772 generic.go:334] "Generic (PLEG): container finished" podID="d583171b-99cd-49da-9a9f-48931806cb45" containerID="809f06b005772e28360eaf3f49b070a296c1f8e64d6ac832b3fd457afe7571b1" exitCode=0 Jan 27 15:19:24 crc kubenswrapper[4772]: I0127 15:19:24.995010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" event={"ID":"d583171b-99cd-49da-9a9f-48931806cb45","Type":"ContainerDied","Data":"809f06b005772e28360eaf3f49b070a296c1f8e64d6ac832b3fd457afe7571b1"} Jan 27 15:19:24 crc kubenswrapper[4772]: I0127 15:19:24.995304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" event={"ID":"d583171b-99cd-49da-9a9f-48931806cb45","Type":"ContainerStarted","Data":"aaf6d95824b5eed2d7c5b70fa582cfb3a7afd152705a6bd3badccbd05f100bf9"} Jan 27 15:19:27 crc kubenswrapper[4772]: I0127 15:19:27.009947 4772 generic.go:334] "Generic (PLEG): container finished" podID="d583171b-99cd-49da-9a9f-48931806cb45" containerID="3edc59963bb2eb5ccee335b1bb8a54cddb9975464ce2916da2d9b900d384092c" exitCode=0 Jan 27 15:19:27 crc kubenswrapper[4772]: I0127 15:19:27.010085 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" event={"ID":"d583171b-99cd-49da-9a9f-48931806cb45","Type":"ContainerDied","Data":"3edc59963bb2eb5ccee335b1bb8a54cddb9975464ce2916da2d9b900d384092c"} Jan 27 15:19:28 crc kubenswrapper[4772]: I0127 15:19:28.016357 4772 generic.go:334] "Generic (PLEG): container finished" podID="d583171b-99cd-49da-9a9f-48931806cb45" containerID="9a8a4fd5cb4bc936b5cac14dfa14122988a0b6fbe2b78903ad0bf46c28a9b6d8" exitCode=0 Jan 27 15:19:28 crc kubenswrapper[4772]: I0127 15:19:28.016395 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" event={"ID":"d583171b-99cd-49da-9a9f-48931806cb45","Type":"ContainerDied","Data":"9a8a4fd5cb4bc936b5cac14dfa14122988a0b6fbe2b78903ad0bf46c28a9b6d8"} Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.274772 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.406135 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util\") pod \"d583171b-99cd-49da-9a9f-48931806cb45\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.406257 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle\") pod \"d583171b-99cd-49da-9a9f-48931806cb45\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.406290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwcjv\" (UniqueName: \"kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv\") pod \"d583171b-99cd-49da-9a9f-48931806cb45\" (UID: \"d583171b-99cd-49da-9a9f-48931806cb45\") " Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.406867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle" (OuterVolumeSpecName: "bundle") pod "d583171b-99cd-49da-9a9f-48931806cb45" (UID: "d583171b-99cd-49da-9a9f-48931806cb45"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.412320 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv" (OuterVolumeSpecName: "kube-api-access-lwcjv") pod "d583171b-99cd-49da-9a9f-48931806cb45" (UID: "d583171b-99cd-49da-9a9f-48931806cb45"). InnerVolumeSpecName "kube-api-access-lwcjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.420663 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util" (OuterVolumeSpecName: "util") pod "d583171b-99cd-49da-9a9f-48931806cb45" (UID: "d583171b-99cd-49da-9a9f-48931806cb45"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.497773 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:29 crc kubenswrapper[4772]: E0127 15:19:29.498032 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="pull" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.498045 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="pull" Jan 27 15:19:29 crc kubenswrapper[4772]: E0127 15:19:29.498059 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="extract" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.498065 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="extract" Jan 27 15:19:29 crc kubenswrapper[4772]: E0127 15:19:29.498083 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="util" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.498091 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="util" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.498213 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d583171b-99cd-49da-9a9f-48931806cb45" containerName="extract" Jan 27 15:19:29 crc 
kubenswrapper[4772]: I0127 15:19:29.498959 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.504916 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.507886 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.508033 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d583171b-99cd-49da-9a9f-48931806cb45-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.508137 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwcjv\" (UniqueName: \"kubernetes.io/projected/d583171b-99cd-49da-9a9f-48931806cb45-kube-api-access-lwcjv\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.610005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.610080 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.610132 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbc5f\" (UniqueName: \"kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.710995 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbc5f\" (UniqueName: \"kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.711069 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.711133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.711856 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.712070 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.731969 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbc5f\" (UniqueName: \"kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f\") pod \"redhat-operators-wjx8r\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:29 crc kubenswrapper[4772]: I0127 15:19:29.829655 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:30 crc kubenswrapper[4772]: I0127 15:19:30.043479 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:30 crc kubenswrapper[4772]: I0127 15:19:30.043855 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" event={"ID":"d583171b-99cd-49da-9a9f-48931806cb45","Type":"ContainerDied","Data":"aaf6d95824b5eed2d7c5b70fa582cfb3a7afd152705a6bd3badccbd05f100bf9"} Jan 27 15:19:30 crc kubenswrapper[4772]: I0127 15:19:30.043883 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaf6d95824b5eed2d7c5b70fa582cfb3a7afd152705a6bd3badccbd05f100bf9" Jan 27 15:19:30 crc kubenswrapper[4772]: I0127 15:19:30.043652 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf" Jan 27 15:19:30 crc kubenswrapper[4772]: W0127 15:19:30.043916 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09a099f0_cf75_4584_9369_e146e1898ee5.slice/crio-5400ef2d30491c889dad91bc65f72f745b56aba86dca10ba2c7a00a785a6ae51 WatchSource:0}: Error finding container 5400ef2d30491c889dad91bc65f72f745b56aba86dca10ba2c7a00a785a6ae51: Status 404 returned error can't find the container with id 5400ef2d30491c889dad91bc65f72f745b56aba86dca10ba2c7a00a785a6ae51 Jan 27 15:19:30 crc kubenswrapper[4772]: I0127 15:19:30.203937 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.050485 4772 generic.go:334] "Generic (PLEG): container finished" podID="09a099f0-cf75-4584-9369-e146e1898ee5" containerID="db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1" exitCode=0 Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.050542 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerDied","Data":"db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1"} Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.050810 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerStarted","Data":"5400ef2d30491c889dad91bc65f72f745b56aba86dca10ba2c7a00a785a6ae51"} Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.379444 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mdjph"] Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.380257 4772 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.382925 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.383028 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bcpxd" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.385568 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.397669 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mdjph"] Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.433285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qg8\" (UniqueName: \"kubernetes.io/projected/2914eab0-19c8-464b-a774-d30a492f6763-kube-api-access-n5qg8\") pod \"nmstate-operator-646758c888-mdjph\" (UID: \"2914eab0-19c8-464b-a774-d30a492f6763\") " pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.534339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qg8\" (UniqueName: \"kubernetes.io/projected/2914eab0-19c8-464b-a774-d30a492f6763-kube-api-access-n5qg8\") pod \"nmstate-operator-646758c888-mdjph\" (UID: \"2914eab0-19c8-464b-a774-d30a492f6763\") " pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.559663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qg8\" (UniqueName: \"kubernetes.io/projected/2914eab0-19c8-464b-a774-d30a492f6763-kube-api-access-n5qg8\") pod \"nmstate-operator-646758c888-mdjph\" 
(UID: \"2914eab0-19c8-464b-a774-d30a492f6763\") " pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.697676 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" Jan 27 15:19:31 crc kubenswrapper[4772]: I0127 15:19:31.960418 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mdjph"] Jan 27 15:19:31 crc kubenswrapper[4772]: W0127 15:19:31.971600 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2914eab0_19c8_464b_a774_d30a492f6763.slice/crio-6c6fdd92e5a1aee735a193b46e66d29f340458e8944dfdf216d0e03d42c2b335 WatchSource:0}: Error finding container 6c6fdd92e5a1aee735a193b46e66d29f340458e8944dfdf216d0e03d42c2b335: Status 404 returned error can't find the container with id 6c6fdd92e5a1aee735a193b46e66d29f340458e8944dfdf216d0e03d42c2b335 Jan 27 15:19:32 crc kubenswrapper[4772]: I0127 15:19:32.057608 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerStarted","Data":"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9"} Jan 27 15:19:32 crc kubenswrapper[4772]: I0127 15:19:32.059016 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" event={"ID":"2914eab0-19c8-464b-a774-d30a492f6763","Type":"ContainerStarted","Data":"6c6fdd92e5a1aee735a193b46e66d29f340458e8944dfdf216d0e03d42c2b335"} Jan 27 15:19:33 crc kubenswrapper[4772]: I0127 15:19:33.066640 4772 generic.go:334] "Generic (PLEG): container finished" podID="09a099f0-cf75-4584-9369-e146e1898ee5" containerID="a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9" exitCode=0 Jan 27 15:19:33 crc kubenswrapper[4772]: I0127 15:19:33.066681 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerDied","Data":"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9"} Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.076567 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" event={"ID":"2914eab0-19c8-464b-a774-d30a492f6763","Type":"ContainerStarted","Data":"15e2a155bf300de194c72a842f30d8c0505b2430fc699c4937e38716d89daf3e"} Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.079561 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerStarted","Data":"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef"} Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.099184 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-mdjph" podStartSLOduration=2.004480249 podStartE2EDuration="4.09915238s" podCreationTimestamp="2026-01-27 15:19:31 +0000 UTC" firstStartedPulling="2026-01-27 15:19:31.97434937 +0000 UTC m=+757.954958468" lastFinishedPulling="2026-01-27 15:19:34.069021501 +0000 UTC m=+760.049630599" observedRunningTime="2026-01-27 15:19:35.095484415 +0000 UTC m=+761.076093513" watchObservedRunningTime="2026-01-27 15:19:35.09915238 +0000 UTC m=+761.079761478" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.121237 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wjx8r" podStartSLOduration=3.106218951 podStartE2EDuration="6.121208746s" podCreationTimestamp="2026-01-27 15:19:29 +0000 UTC" firstStartedPulling="2026-01-27 15:19:31.052306666 +0000 UTC m=+757.032915764" lastFinishedPulling="2026-01-27 15:19:34.067296461 +0000 UTC m=+760.047905559" 
observedRunningTime="2026-01-27 15:19:35.120345321 +0000 UTC m=+761.100954479" watchObservedRunningTime="2026-01-27 15:19:35.121208746 +0000 UTC m=+761.101817884" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.951260 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-g7d66"] Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.952798 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.954981 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-9xfrz" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.956479 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc"] Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.957332 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.961414 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.964275 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-g7d66"] Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.982703 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc"] Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.986504 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mmdc5"] Jan 27 15:19:35 crc kubenswrapper[4772]: I0127 15:19:35.987247 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.000809 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.000879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8kd\" (UniqueName: \"kubernetes.io/projected/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-kube-api-access-hp8kd\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.000924 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-nmstate-lock\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.001029 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-dbus-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.001061 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjm9\" (UniqueName: \"kubernetes.io/projected/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-kube-api-access-8rjm9\") pod 
\"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.001110 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-ovs-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.001142 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skgs\" (UniqueName: \"kubernetes.io/projected/004d59b7-1d3b-41af-8c3d-c6562dd9716a-kube-api-access-2skgs\") pod \"nmstate-metrics-54757c584b-g7d66\" (UID: \"004d59b7-1d3b-41af-8c3d-c6562dd9716a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.092225 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99"] Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.093009 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.095836 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cc7j4" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.096044 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.096188 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105444 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-ovs-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skgs\" (UniqueName: \"kubernetes.io/projected/004d59b7-1d3b-41af-8c3d-c6562dd9716a-kube-api-access-2skgs\") pod \"nmstate-metrics-54757c584b-g7d66\" (UID: \"004d59b7-1d3b-41af-8c3d-c6562dd9716a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105517 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3706a5f9-4370-4cca-abb9-b23e8b9c828f-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105578 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp8kd\" (UniqueName: \"kubernetes.io/projected/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-kube-api-access-hp8kd\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105621 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-nmstate-lock\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105667 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlw6p\" (UniqueName: \"kubernetes.io/projected/3706a5f9-4370-4cca-abb9-b23e8b9c828f-kube-api-access-mlw6p\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105701 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-dbus-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105898 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjm9\" (UniqueName: \"kubernetes.io/projected/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-kube-api-access-8rjm9\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.105958 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-ovs-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.106008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-dbus-socket\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: E0127 15:19:36.106106 4772 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 27 15:19:36 crc kubenswrapper[4772]: E0127 15:19:36.106145 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair podName:cf21b49c-f01b-4c7c-bdb9-57e115b364d9 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:19:36.606130313 +0000 UTC m=+762.586739411 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-gpfqc" (UID: "cf21b49c-f01b-4c7c-bdb9-57e115b364d9") : secret "openshift-nmstate-webhook" not found Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.106479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-nmstate-lock\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.108356 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99"] Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.137881 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp8kd\" (UniqueName: \"kubernetes.io/projected/d834ccf6-9b3a-4a3e-8980-7f0a102babd0-kube-api-access-hp8kd\") pod \"nmstate-handler-mmdc5\" (UID: \"d834ccf6-9b3a-4a3e-8980-7f0a102babd0\") " pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.140138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skgs\" (UniqueName: \"kubernetes.io/projected/004d59b7-1d3b-41af-8c3d-c6562dd9716a-kube-api-access-2skgs\") pod \"nmstate-metrics-54757c584b-g7d66\" (UID: \"004d59b7-1d3b-41af-8c3d-c6562dd9716a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.145932 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjm9\" (UniqueName: 
\"kubernetes.io/projected/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-kube-api-access-8rjm9\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.207355 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3706a5f9-4370-4cca-abb9-b23e8b9c828f-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.207492 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlw6p\" (UniqueName: \"kubernetes.io/projected/3706a5f9-4370-4cca-abb9-b23e8b9c828f-kube-api-access-mlw6p\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.207550 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: E0127 15:19:36.208158 4772 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 27 15:19:36 crc kubenswrapper[4772]: E0127 15:19:36.208258 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert podName:3706a5f9-4370-4cca-abb9-b23e8b9c828f nodeName:}" failed. 
No retries permitted until 2026-01-27 15:19:36.708236866 +0000 UTC m=+762.688845974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-jlq99" (UID: "3706a5f9-4370-4cca-abb9-b23e8b9c828f") : secret "plugin-serving-cert" not found Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.210548 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3706a5f9-4370-4cca-abb9-b23e8b9c828f-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.232657 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlw6p\" (UniqueName: \"kubernetes.io/projected/3706a5f9-4370-4cca-abb9-b23e8b9c828f-kube-api-access-mlw6p\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.274679 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.290425 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7587874955-cvhbn"] Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.291073 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316450 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-trusted-ca-bundle\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316489 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-oauth-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316516 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7rn\" (UniqueName: \"kubernetes.io/projected/b88eb902-4d85-4dc3-9336-daec648107cf-kube-api-access-sh7rn\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316639 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-service-ca\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316683 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-oauth-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316761 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-console-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.316831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.318032 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.320065 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7587874955-cvhbn"] Jan 27 15:19:36 crc kubenswrapper[4772]: W0127 15:19:36.345543 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd834ccf6_9b3a_4a3e_8980_7f0a102babd0.slice/crio-74cd6762872bf6b32e6299908edcfd6a6e011bf2e45c20e81a5fc62f98313049 WatchSource:0}: Error finding container 74cd6762872bf6b32e6299908edcfd6a6e011bf2e45c20e81a5fc62f98313049: Status 404 returned error can't find the container with id 74cd6762872bf6b32e6299908edcfd6a6e011bf2e45c20e81a5fc62f98313049 Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.417801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.418277 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-trusted-ca-bundle\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.418300 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-oauth-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc 
kubenswrapper[4772]: I0127 15:19:36.418326 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7rn\" (UniqueName: \"kubernetes.io/projected/b88eb902-4d85-4dc3-9336-daec648107cf-kube-api-access-sh7rn\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.418364 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-service-ca\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.418391 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-oauth-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.418440 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-console-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.419933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-oauth-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 
15:19:36.420027 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-service-ca\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.422026 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-trusted-ca-bundle\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.423915 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b88eb902-4d85-4dc3-9336-daec648107cf-console-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.425138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-oauth-config\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.425182 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b88eb902-4d85-4dc3-9336-daec648107cf-console-serving-cert\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.442718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-sh7rn\" (UniqueName: \"kubernetes.io/projected/b88eb902-4d85-4dc3-9336-daec648107cf-kube-api-access-sh7rn\") pod \"console-7587874955-cvhbn\" (UID: \"b88eb902-4d85-4dc3-9336-daec648107cf\") " pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.533458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-g7d66"] Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.620994 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.624950 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21b49c-f01b-4c7c-bdb9-57e115b364d9-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-gpfqc\" (UID: \"cf21b49c-f01b-4c7c-bdb9-57e115b364d9\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.649128 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.721868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.725432 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3706a5f9-4370-4cca-abb9-b23e8b9c828f-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-jlq99\" (UID: \"3706a5f9-4370-4cca-abb9-b23e8b9c828f\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:36 crc kubenswrapper[4772]: I0127 15:19:36.883921 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.020343 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.049507 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc"] Jan 27 15:19:37 crc kubenswrapper[4772]: W0127 15:19:37.056757 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf21b49c_f01b_4c7c_bdb9_57e115b364d9.slice/crio-cb4a0bda422e0b5b92ec8b7453b03549dbfb27214199a98cdb713ea2cb4962a5 WatchSource:0}: Error finding container cb4a0bda422e0b5b92ec8b7453b03549dbfb27214199a98cdb713ea2cb4962a5: Status 404 returned error can't find the container with id cb4a0bda422e0b5b92ec8b7453b03549dbfb27214199a98cdb713ea2cb4962a5 Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.096151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" event={"ID":"004d59b7-1d3b-41af-8c3d-c6562dd9716a","Type":"ContainerStarted","Data":"bbc361de727bd2369e0573f411a5acac824bb4d34535d084ae81adcfc98bf23f"} Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.097236 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmdc5" event={"ID":"d834ccf6-9b3a-4a3e-8980-7f0a102babd0","Type":"ContainerStarted","Data":"74cd6762872bf6b32e6299908edcfd6a6e011bf2e45c20e81a5fc62f98313049"} Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.098351 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" event={"ID":"cf21b49c-f01b-4c7c-bdb9-57e115b364d9","Type":"ContainerStarted","Data":"cb4a0bda422e0b5b92ec8b7453b03549dbfb27214199a98cdb713ea2cb4962a5"} Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.101835 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7587874955-cvhbn"] Jan 27 15:19:37 crc kubenswrapper[4772]: W0127 
15:19:37.124713 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb88eb902_4d85_4dc3_9336_daec648107cf.slice/crio-8f0d2778bdef050f5fb3d11428ef1cb145879b491c8be1450efa14ff19e4b0f5 WatchSource:0}: Error finding container 8f0d2778bdef050f5fb3d11428ef1cb145879b491c8be1450efa14ff19e4b0f5: Status 404 returned error can't find the container with id 8f0d2778bdef050f5fb3d11428ef1cb145879b491c8be1450efa14ff19e4b0f5 Jan 27 15:19:37 crc kubenswrapper[4772]: I0127 15:19:37.256568 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99"] Jan 27 15:19:37 crc kubenswrapper[4772]: W0127 15:19:37.263431 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3706a5f9_4370_4cca_abb9_b23e8b9c828f.slice/crio-b16dd94e3004bea0e6593538807931500b18201385283e58cf176558aa7a8a91 WatchSource:0}: Error finding container b16dd94e3004bea0e6593538807931500b18201385283e58cf176558aa7a8a91: Status 404 returned error can't find the container with id b16dd94e3004bea0e6593538807931500b18201385283e58cf176558aa7a8a91 Jan 27 15:19:38 crc kubenswrapper[4772]: I0127 15:19:38.107673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" event={"ID":"3706a5f9-4370-4cca-abb9-b23e8b9c828f","Type":"ContainerStarted","Data":"b16dd94e3004bea0e6593538807931500b18201385283e58cf176558aa7a8a91"} Jan 27 15:19:38 crc kubenswrapper[4772]: I0127 15:19:38.110353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7587874955-cvhbn" event={"ID":"b88eb902-4d85-4dc3-9336-daec648107cf","Type":"ContainerStarted","Data":"c2d567fb0b2f6dbdef0f4485f0fba6e81e898002431a48fc7faa6d20c6dde697"} Jan 27 15:19:38 crc kubenswrapper[4772]: I0127 15:19:38.110383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7587874955-cvhbn" event={"ID":"b88eb902-4d85-4dc3-9336-daec648107cf","Type":"ContainerStarted","Data":"8f0d2778bdef050f5fb3d11428ef1cb145879b491c8be1450efa14ff19e4b0f5"} Jan 27 15:19:38 crc kubenswrapper[4772]: I0127 15:19:38.150341 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7587874955-cvhbn" podStartSLOduration=2.150321828 podStartE2EDuration="2.150321828s" podCreationTimestamp="2026-01-27 15:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:19:38.1466067 +0000 UTC m=+764.127215798" watchObservedRunningTime="2026-01-27 15:19:38.150321828 +0000 UTC m=+764.130930946" Jan 27 15:19:39 crc kubenswrapper[4772]: I0127 15:19:39.830320 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:39 crc kubenswrapper[4772]: I0127 15:19:39.830637 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:40 crc kubenswrapper[4772]: I0127 15:19:40.865272 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wjx8r" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="registry-server" probeResult="failure" output=< Jan 27 15:19:40 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 15:19:40 crc kubenswrapper[4772]: > Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.144026 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" event={"ID":"cf21b49c-f01b-4c7c-bdb9-57e115b364d9","Type":"ContainerStarted","Data":"936192f8fc1de4210fb2ca33015efcb6990e6ba9a61d80f6df411fc8be3e1edc"} Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.144220 4772 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.147957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" event={"ID":"004d59b7-1d3b-41af-8c3d-c6562dd9716a","Type":"ContainerStarted","Data":"c90797f7a014cf412ef32210553a472dee22c58f527a9768fc7f51ad8be45f87"} Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.150765 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mmdc5" event={"ID":"d834ccf6-9b3a-4a3e-8980-7f0a102babd0","Type":"ContainerStarted","Data":"a907a3c154ee056d510026b823e6420e785bf97e6a98b4c06bb43b132e2ed452"} Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.150989 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.161289 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" podStartSLOduration=3.071497556 podStartE2EDuration="6.161269216s" podCreationTimestamp="2026-01-27 15:19:35 +0000 UTC" firstStartedPulling="2026-01-27 15:19:37.060159229 +0000 UTC m=+763.040768327" lastFinishedPulling="2026-01-27 15:19:40.149930889 +0000 UTC m=+766.130539987" observedRunningTime="2026-01-27 15:19:41.160203775 +0000 UTC m=+767.140812893" watchObservedRunningTime="2026-01-27 15:19:41.161269216 +0000 UTC m=+767.141878314" Jan 27 15:19:41 crc kubenswrapper[4772]: I0127 15:19:41.183307 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mmdc5" podStartSLOduration=2.353455832 podStartE2EDuration="6.183286851s" podCreationTimestamp="2026-01-27 15:19:35 +0000 UTC" firstStartedPulling="2026-01-27 15:19:36.350790294 +0000 UTC m=+762.331399392" lastFinishedPulling="2026-01-27 15:19:40.180621313 +0000 UTC m=+766.161230411" 
observedRunningTime="2026-01-27 15:19:41.182137817 +0000 UTC m=+767.162746935" watchObservedRunningTime="2026-01-27 15:19:41.183286851 +0000 UTC m=+767.163895949" Jan 27 15:19:42 crc kubenswrapper[4772]: I0127 15:19:42.059021 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:19:42 crc kubenswrapper[4772]: I0127 15:19:42.059091 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:19:43 crc kubenswrapper[4772]: I0127 15:19:43.165696 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" event={"ID":"004d59b7-1d3b-41af-8c3d-c6562dd9716a","Type":"ContainerStarted","Data":"3e8f259bfabcaf4122367dc58f7e2c3cab40fdbf3006a7ed27c8bd4cfc4c525c"} Jan 27 15:19:43 crc kubenswrapper[4772]: I0127 15:19:43.189059 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-g7d66" podStartSLOduration=2.200072662 podStartE2EDuration="8.189039939s" podCreationTimestamp="2026-01-27 15:19:35 +0000 UTC" firstStartedPulling="2026-01-27 15:19:36.557567694 +0000 UTC m=+762.538176792" lastFinishedPulling="2026-01-27 15:19:42.546534931 +0000 UTC m=+768.527144069" observedRunningTime="2026-01-27 15:19:43.183092768 +0000 UTC m=+769.163701876" watchObservedRunningTime="2026-01-27 15:19:43.189039939 +0000 UTC m=+769.169649037" Jan 27 15:19:46 crc kubenswrapper[4772]: I0127 15:19:46.352534 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-mmdc5" Jan 27 15:19:46 crc kubenswrapper[4772]: I0127 15:19:46.649304 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:46 crc kubenswrapper[4772]: I0127 15:19:46.649367 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:46 crc kubenswrapper[4772]: I0127 15:19:46.659374 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:47 crc kubenswrapper[4772]: I0127 15:19:47.198032 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7587874955-cvhbn" Jan 27 15:19:47 crc kubenswrapper[4772]: I0127 15:19:47.251264 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:19:49 crc kubenswrapper[4772]: I0127 15:19:49.869368 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:49 crc kubenswrapper[4772]: I0127 15:19:49.959428 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:50 crc kubenswrapper[4772]: I0127 15:19:50.121976 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.220586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" event={"ID":"3706a5f9-4370-4cca-abb9-b23e8b9c828f","Type":"ContainerStarted","Data":"1bf82255d91ce78638063827979b83642dcb63bc78c82d7f9b9cce2badea6e57"} Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.220761 4772 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-wjx8r" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="registry-server" containerID="cri-o://c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef" gracePeriod=2 Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.260007 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-jlq99" podStartSLOduration=2.230594602 podStartE2EDuration="15.259971589s" podCreationTimestamp="2026-01-27 15:19:36 +0000 UTC" firstStartedPulling="2026-01-27 15:19:37.265919129 +0000 UTC m=+763.246528217" lastFinishedPulling="2026-01-27 15:19:50.295296116 +0000 UTC m=+776.275905204" observedRunningTime="2026-01-27 15:19:51.246341066 +0000 UTC m=+777.226950204" watchObservedRunningTime="2026-01-27 15:19:51.259971589 +0000 UTC m=+777.240580727" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.554484 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.575273 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities\") pod \"09a099f0-cf75-4584-9369-e146e1898ee5\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.575364 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbc5f\" (UniqueName: \"kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f\") pod \"09a099f0-cf75-4584-9369-e146e1898ee5\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.575490 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content\") pod \"09a099f0-cf75-4584-9369-e146e1898ee5\" (UID: \"09a099f0-cf75-4584-9369-e146e1898ee5\") " Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.580449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities" (OuterVolumeSpecName: "utilities") pod "09a099f0-cf75-4584-9369-e146e1898ee5" (UID: "09a099f0-cf75-4584-9369-e146e1898ee5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.580887 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.581155 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f" (OuterVolumeSpecName: "kube-api-access-pbc5f") pod "09a099f0-cf75-4584-9369-e146e1898ee5" (UID: "09a099f0-cf75-4584-9369-e146e1898ee5"). InnerVolumeSpecName "kube-api-access-pbc5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.682331 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbc5f\" (UniqueName: \"kubernetes.io/projected/09a099f0-cf75-4584-9369-e146e1898ee5-kube-api-access-pbc5f\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.731677 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09a099f0-cf75-4584-9369-e146e1898ee5" (UID: "09a099f0-cf75-4584-9369-e146e1898ee5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:19:51 crc kubenswrapper[4772]: I0127 15:19:51.783913 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09a099f0-cf75-4584-9369-e146e1898ee5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.236944 4772 generic.go:334] "Generic (PLEG): container finished" podID="09a099f0-cf75-4584-9369-e146e1898ee5" containerID="c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef" exitCode=0 Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.237013 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wjx8r" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.237061 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerDied","Data":"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef"} Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.237123 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wjx8r" event={"ID":"09a099f0-cf75-4584-9369-e146e1898ee5","Type":"ContainerDied","Data":"5400ef2d30491c889dad91bc65f72f745b56aba86dca10ba2c7a00a785a6ae51"} Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.237152 4772 scope.go:117] "RemoveContainer" containerID="c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.261886 4772 scope.go:117] "RemoveContainer" containerID="a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.276332 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 
15:19:52.282148 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wjx8r"] Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.298963 4772 scope.go:117] "RemoveContainer" containerID="db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.313431 4772 scope.go:117] "RemoveContainer" containerID="c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef" Jan 27 15:19:52 crc kubenswrapper[4772]: E0127 15:19:52.314109 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef\": container with ID starting with c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef not found: ID does not exist" containerID="c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.314159 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef"} err="failed to get container status \"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef\": rpc error: code = NotFound desc = could not find container \"c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef\": container with ID starting with c89368d0949f4bd3cbe349eab2729e718b9b735415848548f5754e07c04a3cef not found: ID does not exist" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.314203 4772 scope.go:117] "RemoveContainer" containerID="a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9" Jan 27 15:19:52 crc kubenswrapper[4772]: E0127 15:19:52.314613 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9\": container with ID 
starting with a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9 not found: ID does not exist" containerID="a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.314668 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9"} err="failed to get container status \"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9\": rpc error: code = NotFound desc = could not find container \"a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9\": container with ID starting with a779b499675d02b76b862fa155c06070ae6ad9b590cec73ad50ae9e5d0cd44a9 not found: ID does not exist" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.314705 4772 scope.go:117] "RemoveContainer" containerID="db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1" Jan 27 15:19:52 crc kubenswrapper[4772]: E0127 15:19:52.315094 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1\": container with ID starting with db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1 not found: ID does not exist" containerID="db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.315123 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1"} err="failed to get container status \"db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1\": rpc error: code = NotFound desc = could not find container \"db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1\": container with ID starting with db8ffbbcaeec3abdc8460ac13f60a94b11906e285d02efbfc2441a057fa123e1 not found: 
ID does not exist" Jan 27 15:19:52 crc kubenswrapper[4772]: I0127 15:19:52.669153 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" path="/var/lib/kubelet/pods/09a099f0-cf75-4584-9369-e146e1898ee5/volumes" Jan 27 15:19:56 crc kubenswrapper[4772]: I0127 15:19:56.893423 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-gpfqc" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.693106 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk"] Jan 27 15:20:08 crc kubenswrapper[4772]: E0127 15:20:08.694487 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="registry-server" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.694506 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="registry-server" Jan 27 15:20:08 crc kubenswrapper[4772]: E0127 15:20:08.694523 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="extract-content" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.694532 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="extract-content" Jan 27 15:20:08 crc kubenswrapper[4772]: E0127 15:20:08.694555 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="extract-utilities" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.694565 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="extract-utilities" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.694763 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="09a099f0-cf75-4584-9369-e146e1898ee5" containerName="registry-server" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.697829 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.699866 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.715670 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk"] Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.881570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.881646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.881689 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jxtg\" (UniqueName: \"kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg\") pod 
\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.983646 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.983733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.983797 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jxtg\" (UniqueName: \"kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.984299 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:08 crc kubenswrapper[4772]: I0127 15:20:08.984402 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:09 crc kubenswrapper[4772]: I0127 15:20:09.008711 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jxtg\" (UniqueName: \"kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:09 crc kubenswrapper[4772]: I0127 15:20:09.017128 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:09 crc kubenswrapper[4772]: I0127 15:20:09.458737 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk"] Jan 27 15:20:09 crc kubenswrapper[4772]: W0127 15:20:09.470788 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09090577_fdfa_4f36_badf_f32c6ee2ab7d.slice/crio-5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2 WatchSource:0}: Error finding container 5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2: Status 404 returned error can't find the container with id 5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2 Jan 27 15:20:10 crc kubenswrapper[4772]: I0127 15:20:10.355002 4772 generic.go:334] "Generic (PLEG): container finished" podID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerID="439c3bfaa41668060ae94dd39d5966e30d1a6826569c7e63cae066908ef685a6" exitCode=0 Jan 27 15:20:10 crc kubenswrapper[4772]: I0127 15:20:10.355085 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" event={"ID":"09090577-fdfa-4f36-badf-f32c6ee2ab7d","Type":"ContainerDied","Data":"439c3bfaa41668060ae94dd39d5966e30d1a6826569c7e63cae066908ef685a6"} Jan 27 15:20:10 crc kubenswrapper[4772]: I0127 15:20:10.355143 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" event={"ID":"09090577-fdfa-4f36-badf-f32c6ee2ab7d","Type":"ContainerStarted","Data":"5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2"} Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.058335 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.058863 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.058919 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.059590 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.059664 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3" gracePeriod=600 Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.300681 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7qfrl" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerName="console" containerID="cri-o://5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652" 
gracePeriod=15 Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.370303 4772 generic.go:334] "Generic (PLEG): container finished" podID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerID="85f37a805cce12fdeb9e9c9c28438e3e6b7638e7b3599585f99fb6660d5b1264" exitCode=0 Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.370387 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" event={"ID":"09090577-fdfa-4f36-badf-f32c6ee2ab7d","Type":"ContainerDied","Data":"85f37a805cce12fdeb9e9c9c28438e3e6b7638e7b3599585f99fb6660d5b1264"} Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.372892 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3" exitCode=0 Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.372940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3"} Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.373042 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497"} Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.373073 4772 scope.go:117] "RemoveContainer" containerID="32659ec7f069b0827082828bb6142c20199821498a042e5f263706f6e96e9462" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.646309 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7qfrl_e2e31e5f-3a41-42f5-90b0-99c05a8033a6/console/0.log" Jan 27 15:20:12 crc kubenswrapper[4772]: 
I0127 15:20:12.646622 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.844554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845074 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845205 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845345 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmgkc\" (UniqueName: \"kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845463 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845474 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845577 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config\") pod \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\" (UID: \"e2e31e5f-3a41-42f5-90b0-99c05a8033a6\") " Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 
15:20:12.845937 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.845952 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.846060 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca" (OuterVolumeSpecName: "service-ca") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.846374 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config" (OuterVolumeSpecName: "console-config") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.850210 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.850290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.850363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc" (OuterVolumeSpecName: "kube-api-access-vmgkc") pod "e2e31e5f-3a41-42f5-90b0-99c05a8033a6" (UID: "e2e31e5f-3a41-42f5-90b0-99c05a8033a6"). InnerVolumeSpecName "kube-api-access-vmgkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.947082 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.947130 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.947159 4772 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.947203 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmgkc\" (UniqueName: 
\"kubernetes.io/projected/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-kube-api-access-vmgkc\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:12 crc kubenswrapper[4772]: I0127 15:20:12.947222 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2e31e5f-3a41-42f5-90b0-99c05a8033a6-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.385360 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7qfrl_e2e31e5f-3a41-42f5-90b0-99c05a8033a6/console/0.log" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.386430 4772 generic.go:334] "Generic (PLEG): container finished" podID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerID="5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652" exitCode=2 Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.386506 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7qfrl" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.386523 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7qfrl" event={"ID":"e2e31e5f-3a41-42f5-90b0-99c05a8033a6","Type":"ContainerDied","Data":"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652"} Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.386561 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7qfrl" event={"ID":"e2e31e5f-3a41-42f5-90b0-99c05a8033a6","Type":"ContainerDied","Data":"689105dc82b6dcc122fad60678c44aee714f4e2b250e67f0c76903dd34d0b5c3"} Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.386582 4772 scope.go:117] "RemoveContainer" containerID="5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.389793 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerID="1700e0008ac67d7585a3770cc2c3ff72683a1c89a65d5123d41fbd195bc151f9" exitCode=0 Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.389835 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" event={"ID":"09090577-fdfa-4f36-badf-f32c6ee2ab7d","Type":"ContainerDied","Data":"1700e0008ac67d7585a3770cc2c3ff72683a1c89a65d5123d41fbd195bc151f9"} Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.407463 4772 scope.go:117] "RemoveContainer" containerID="5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652" Jan 27 15:20:13 crc kubenswrapper[4772]: E0127 15:20:13.408130 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652\": container with ID starting with 5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652 not found: ID does not exist" containerID="5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.408318 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652"} err="failed to get container status \"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652\": rpc error: code = NotFound desc = could not find container \"5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652\": container with ID starting with 5fa14544d0f474c1dab5359f3cbcee7247c22e26d183454cd30d9fa3ab064652 not found: ID does not exist" Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.426882 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:20:13 crc kubenswrapper[4772]: I0127 15:20:13.434120 4772 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7qfrl"] Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.677498 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" path="/var/lib/kubelet/pods/e2e31e5f-3a41-42f5-90b0-99c05a8033a6/volumes" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.688028 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.875400 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util\") pod \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.875564 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle\") pod \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.875621 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jxtg\" (UniqueName: \"kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg\") pod \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\" (UID: \"09090577-fdfa-4f36-badf-f32c6ee2ab7d\") " Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.877504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle" (OuterVolumeSpecName: "bundle") pod "09090577-fdfa-4f36-badf-f32c6ee2ab7d" (UID: "09090577-fdfa-4f36-badf-f32c6ee2ab7d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.884579 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg" (OuterVolumeSpecName: "kube-api-access-9jxtg") pod "09090577-fdfa-4f36-badf-f32c6ee2ab7d" (UID: "09090577-fdfa-4f36-badf-f32c6ee2ab7d"). InnerVolumeSpecName "kube-api-access-9jxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.909557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util" (OuterVolumeSpecName: "util") pod "09090577-fdfa-4f36-badf-f32c6ee2ab7d" (UID: "09090577-fdfa-4f36-badf-f32c6ee2ab7d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.978883 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.978934 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/09090577-fdfa-4f36-badf-f32c6ee2ab7d-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:14 crc kubenswrapper[4772]: I0127 15:20:14.978950 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jxtg\" (UniqueName: \"kubernetes.io/projected/09090577-fdfa-4f36-badf-f32c6ee2ab7d-kube-api-access-9jxtg\") on node \"crc\" DevicePath \"\"" Jan 27 15:20:15 crc kubenswrapper[4772]: I0127 15:20:15.410155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" 
event={"ID":"09090577-fdfa-4f36-badf-f32c6ee2ab7d","Type":"ContainerDied","Data":"5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2"} Jan 27 15:20:15 crc kubenswrapper[4772]: I0127 15:20:15.410869 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eace4e07ae1ddfd9ca33204554ccf4c3b1f352fe916de49d02bb84876aa31e2" Jan 27 15:20:15 crc kubenswrapper[4772]: I0127 15:20:15.410291 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.392647 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4"] Jan 27 15:20:24 crc kubenswrapper[4772]: E0127 15:20:24.393686 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="util" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393702 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="util" Jan 27 15:20:24 crc kubenswrapper[4772]: E0127 15:20:24.393717 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="extract" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393725 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="extract" Jan 27 15:20:24 crc kubenswrapper[4772]: E0127 15:20:24.393735 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerName="console" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393743 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerName="console" Jan 27 15:20:24 crc kubenswrapper[4772]: E0127 15:20:24.393761 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="pull" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393767 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="pull" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393881 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e31e5f-3a41-42f5-90b0-99c05a8033a6" containerName="console" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.393896 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="09090577-fdfa-4f36-badf-f32c6ee2ab7d" containerName="extract" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.394438 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.401396 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.401580 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.401739 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.401935 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tpk9v" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.406279 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.412594 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4"] Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.493722 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnhq\" (UniqueName: \"kubernetes.io/projected/f72c611d-60d8-4649-a410-38434d01d8e2-kube-api-access-sbnhq\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.493778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-webhook-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.493816 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.595599 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnhq\" (UniqueName: \"kubernetes.io/projected/f72c611d-60d8-4649-a410-38434d01d8e2-kube-api-access-sbnhq\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.595910 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-webhook-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.596045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.606289 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-apiservice-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.607002 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f72c611d-60d8-4649-a410-38434d01d8e2-webhook-cert\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: \"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.625975 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnhq\" (UniqueName: \"kubernetes.io/projected/f72c611d-60d8-4649-a410-38434d01d8e2-kube-api-access-sbnhq\") pod \"metallb-operator-controller-manager-5c6dd9c74b-84qz4\" (UID: 
\"f72c611d-60d8-4649-a410-38434d01d8e2\") " pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.698632 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms"] Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.699724 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.707007 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zhlwq" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.707177 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.707361 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.712648 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms"] Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.731544 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.899882 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-apiservice-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.900100 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-webhook-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:24 crc kubenswrapper[4772]: I0127 15:20:24.900151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4kqk\" (UniqueName: \"kubernetes.io/projected/c5ee8d7f-0160-4526-8ae0-45a50a450725-kube-api-access-w4kqk\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.001115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-apiservice-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.001194 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-webhook-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.001228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4kqk\" (UniqueName: \"kubernetes.io/projected/c5ee8d7f-0160-4526-8ae0-45a50a450725-kube-api-access-w4kqk\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.007819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-apiservice-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.007888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5ee8d7f-0160-4526-8ae0-45a50a450725-webhook-cert\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.024753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4kqk\" (UniqueName: \"kubernetes.io/projected/c5ee8d7f-0160-4526-8ae0-45a50a450725-kube-api-access-w4kqk\") pod \"metallb-operator-webhook-server-66986f9f9f-bmvms\" (UID: \"c5ee8d7f-0160-4526-8ae0-45a50a450725\") " 
pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.025751 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.215129 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4"] Jan 27 15:20:25 crc kubenswrapper[4772]: W0127 15:20:25.229685 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72c611d_60d8_4649_a410_38434d01d8e2.slice/crio-4caf69be229218f9b04edf0ad07b0614ac6cb4226b5bfdaff0acb1e3267678b2 WatchSource:0}: Error finding container 4caf69be229218f9b04edf0ad07b0614ac6cb4226b5bfdaff0acb1e3267678b2: Status 404 returned error can't find the container with id 4caf69be229218f9b04edf0ad07b0614ac6cb4226b5bfdaff0acb1e3267678b2 Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.229824 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms"] Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.475057 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" event={"ID":"c5ee8d7f-0160-4526-8ae0-45a50a450725","Type":"ContainerStarted","Data":"7a92d7b3bea78e77245e067e3091d09809cb9ab9290860ca7457eadb8a23f48f"} Jan 27 15:20:25 crc kubenswrapper[4772]: I0127 15:20:25.477897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" event={"ID":"f72c611d-60d8-4649-a410-38434d01d8e2","Type":"ContainerStarted","Data":"4caf69be229218f9b04edf0ad07b0614ac6cb4226b5bfdaff0acb1e3267678b2"} Jan 27 15:20:28 crc kubenswrapper[4772]: I0127 15:20:28.498149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" event={"ID":"f72c611d-60d8-4649-a410-38434d01d8e2","Type":"ContainerStarted","Data":"2a97c2748ab3c37fc26e8f99921f757adb8b041f0526a0b3adbe03801cfd948f"} Jan 27 15:20:28 crc kubenswrapper[4772]: I0127 15:20:28.498966 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:20:34 crc kubenswrapper[4772]: I0127 15:20:34.551619 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" event={"ID":"c5ee8d7f-0160-4526-8ae0-45a50a450725","Type":"ContainerStarted","Data":"dca38d4ff748fb7a714a98e080648bb0e1e2a8e4d7decd87038933d75280339d"} Jan 27 15:20:34 crc kubenswrapper[4772]: I0127 15:20:34.552663 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:20:34 crc kubenswrapper[4772]: I0127 15:20:34.579130 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" podStartSLOduration=7.516448158 podStartE2EDuration="10.579110537s" podCreationTimestamp="2026-01-27 15:20:24 +0000 UTC" firstStartedPulling="2026-01-27 15:20:25.233055544 +0000 UTC m=+811.213664642" lastFinishedPulling="2026-01-27 15:20:28.295717923 +0000 UTC m=+814.276327021" observedRunningTime="2026-01-27 15:20:28.525072423 +0000 UTC m=+814.505681541" watchObservedRunningTime="2026-01-27 15:20:34.579110537 +0000 UTC m=+820.559719635" Jan 27 15:20:34 crc kubenswrapper[4772]: I0127 15:20:34.579601 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" podStartSLOduration=1.689105377 podStartE2EDuration="10.579594371s" podCreationTimestamp="2026-01-27 15:20:24 +0000 UTC" firstStartedPulling="2026-01-27 
15:20:25.24403269 +0000 UTC m=+811.224641798" lastFinishedPulling="2026-01-27 15:20:34.134521694 +0000 UTC m=+820.115130792" observedRunningTime="2026-01-27 15:20:34.570979973 +0000 UTC m=+820.551589131" watchObservedRunningTime="2026-01-27 15:20:34.579594371 +0000 UTC m=+820.560203469" Jan 27 15:20:45 crc kubenswrapper[4772]: I0127 15:20:45.032371 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66986f9f9f-bmvms" Jan 27 15:21:04 crc kubenswrapper[4772]: I0127 15:21:04.734654 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5c6dd9c74b-84qz4" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.490793 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jhpnb"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.493693 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.495332 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.496145 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.496814 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.497687 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.497804 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-l97nh" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.498140 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.511014 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.594443 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-cl54q"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.595508 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.597524 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.597744 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.598032 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jz9xz" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.598235 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.622513 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-vdg69"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.624541 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.626654 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.633079 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-vdg69"] Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ghc\" (UniqueName: \"kubernetes.io/projected/28ed9da3-cd29-4d80-9703-472bdbb3c64b-kube-api-access-z6ghc\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxg4j\" (UniqueName: \"kubernetes.io/projected/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-kube-api-access-hxg4j\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667299 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-reloader\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667339 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " 
pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-startup\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28ed9da3-cd29-4d80-9703-472bdbb3c64b-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667414 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-conf\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-sockets\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.667487 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics-certs\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 
15:21:05.768111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gljb\" (UniqueName: \"kubernetes.io/projected/24577bed-b34e-4419-9e9e-7068155ba0d1-kube-api-access-4gljb\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768165 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-sockets\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics-certs\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ghc\" (UniqueName: \"kubernetes.io/projected/28ed9da3-cd29-4d80-9703-472bdbb3c64b-kube-api-access-z6ghc\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxg4j\" (UniqueName: \"kubernetes.io/projected/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-kube-api-access-hxg4j\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768321 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-reloader\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768360 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24577bed-b34e-4419-9e9e-7068155ba0d1-metallb-excludel2\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768389 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs9dw\" (UniqueName: \"kubernetes.io/projected/d2282b46-452e-402e-b929-23875b572727-kube-api-access-gs9dw\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-metrics-certs\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768437 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768456 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-metrics-certs\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768477 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-startup\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768499 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28ed9da3-cd29-4d80-9703-472bdbb3c64b-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768525 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-cert\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-conf\") pod \"frr-k8s-jhpnb\" (UID: 
\"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.768943 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-conf\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.769161 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-sockets\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.770566 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-reloader\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.771245 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.771532 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-frr-startup\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.774551 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-metrics-certs\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.775226 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/28ed9da3-cd29-4d80-9703-472bdbb3c64b-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.788987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ghc\" (UniqueName: \"kubernetes.io/projected/28ed9da3-cd29-4d80-9703-472bdbb3c64b-kube-api-access-z6ghc\") pod \"frr-k8s-webhook-server-7df86c4f6c-qpxrs\" (UID: \"28ed9da3-cd29-4d80-9703-472bdbb3c64b\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.798764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxg4j\" (UniqueName: \"kubernetes.io/projected/5d41fcac-7044-4f36-b9f8-0b656bb3bcca-kube-api-access-hxg4j\") pod \"frr-k8s-jhpnb\" (UID: \"5d41fcac-7044-4f36-b9f8-0b656bb3bcca\") " pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.816353 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.829121 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869755 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-cert\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869813 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gljb\" (UniqueName: \"kubernetes.io/projected/24577bed-b34e-4419-9e9e-7068155ba0d1-kube-api-access-4gljb\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869872 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869887 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24577bed-b34e-4419-9e9e-7068155ba0d1-metallb-excludel2\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs9dw\" (UniqueName: \"kubernetes.io/projected/d2282b46-452e-402e-b929-23875b572727-kube-api-access-gs9dw\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc 
kubenswrapper[4772]: I0127 15:21:05.869924 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-metrics-certs\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.869943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-metrics-certs\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: E0127 15:21:05.870325 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 27 15:21:05 crc kubenswrapper[4772]: E0127 15:21:05.870400 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist podName:24577bed-b34e-4419-9e9e-7068155ba0d1 nodeName:}" failed. No retries permitted until 2026-01-27 15:21:06.370382282 +0000 UTC m=+852.350991380 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist") pod "speaker-cl54q" (UID: "24577bed-b34e-4419-9e9e-7068155ba0d1") : secret "metallb-memberlist" not found Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.872944 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.874046 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-metrics-certs\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.880912 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-metrics-certs\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.883492 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d2282b46-452e-402e-b929-23875b572727-cert\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.875424 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/24577bed-b34e-4419-9e9e-7068155ba0d1-metallb-excludel2\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.889135 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gs9dw\" (UniqueName: \"kubernetes.io/projected/d2282b46-452e-402e-b929-23875b572727-kube-api-access-gs9dw\") pod \"controller-6968d8fdc4-vdg69\" (UID: \"d2282b46-452e-402e-b929-23875b572727\") " pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.896702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gljb\" (UniqueName: \"kubernetes.io/projected/24577bed-b34e-4419-9e9e-7068155ba0d1-kube-api-access-4gljb\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:05 crc kubenswrapper[4772]: I0127 15:21:05.952538 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.040621 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs"] Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.232414 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-vdg69"] Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.376147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:06 crc kubenswrapper[4772]: E0127 15:21:06.376294 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 27 15:21:06 crc kubenswrapper[4772]: E0127 15:21:06.376590 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist podName:24577bed-b34e-4419-9e9e-7068155ba0d1 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:21:07.376567138 +0000 UTC m=+853.357176236 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist") pod "speaker-cl54q" (UID: "24577bed-b34e-4419-9e9e-7068155ba0d1") : secret "metallb-memberlist" not found Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.778609 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" event={"ID":"28ed9da3-cd29-4d80-9703-472bdbb3c64b","Type":"ContainerStarted","Data":"780aca966d9b36d525c38f5fb2440499e489ad5d95ae0544152e425165926e70"} Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.780044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"bca95dea96b7d9c84b4d9aac1543290086fcf9e1cd46a039c27562c97260ed25"} Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.782078 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vdg69" event={"ID":"d2282b46-452e-402e-b929-23875b572727","Type":"ContainerStarted","Data":"341afb24b9025101bf1fb56bdbe3737452cb4cc86d59fff9042e46c9401b3d15"} Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.782145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vdg69" event={"ID":"d2282b46-452e-402e-b929-23875b572727","Type":"ContainerStarted","Data":"2c9ba4d1eaa3351a582a108e113dc0dc908f1c4c3bd2e5a07a9d796df90bb0fb"} Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 15:21:06.782161 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-vdg69" event={"ID":"d2282b46-452e-402e-b929-23875b572727","Type":"ContainerStarted","Data":"ceb56c6378c104b948c9d6b068a254ebf7b9f9fe3f2d109e87f59bc0984f28a7"} Jan 27 15:21:06 crc kubenswrapper[4772]: I0127 
15:21:06.807294 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-vdg69" podStartSLOduration=1.807269733 podStartE2EDuration="1.807269733s" podCreationTimestamp="2026-01-27 15:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:21:06.803078132 +0000 UTC m=+852.783687250" watchObservedRunningTime="2026-01-27 15:21:06.807269733 +0000 UTC m=+852.787878851" Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.390065 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.399839 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/24577bed-b34e-4419-9e9e-7068155ba0d1-memberlist\") pod \"speaker-cl54q\" (UID: \"24577bed-b34e-4419-9e9e-7068155ba0d1\") " pod="metallb-system/speaker-cl54q" Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.409291 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-cl54q" Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.814289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl54q" event={"ID":"24577bed-b34e-4419-9e9e-7068155ba0d1","Type":"ContainerStarted","Data":"73276f07d5615044b17ec0fef1b7f7d6010cd89f4ec2ae0dd0ddf1ddd8568bde"} Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.814331 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:07 crc kubenswrapper[4772]: I0127 15:21:07.814347 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl54q" event={"ID":"24577bed-b34e-4419-9e9e-7068155ba0d1","Type":"ContainerStarted","Data":"2240600dba6c2d78ad94c28d4ef41d4fc16fd68d2e6744503d736eb88ce1582e"} Jan 27 15:21:08 crc kubenswrapper[4772]: I0127 15:21:08.822643 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-cl54q" event={"ID":"24577bed-b34e-4419-9e9e-7068155ba0d1","Type":"ContainerStarted","Data":"2b5b3a1b39aae66538d36485921a8a98cabe7ffbece55252ae7e9dfb5706bbd5"} Jan 27 15:21:09 crc kubenswrapper[4772]: I0127 15:21:09.849660 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-cl54q" Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.686914 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-cl54q" podStartSLOduration=9.686898099 podStartE2EDuration="9.686898099s" podCreationTimestamp="2026-01-27 15:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:21:08.846138009 +0000 UTC m=+854.826747127" watchObservedRunningTime="2026-01-27 15:21:14.686898099 +0000 UTC m=+860.667507197" Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.889600 4772 generic.go:334] "Generic (PLEG): container 
finished" podID="5d41fcac-7044-4f36-b9f8-0b656bb3bcca" containerID="4152684fe40ccfc62765e2d701064a0eddfcc9efca765746312af1435e2e0743" exitCode=0 Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.889713 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerDied","Data":"4152684fe40ccfc62765e2d701064a0eddfcc9efca765746312af1435e2e0743"} Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.893526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" event={"ID":"28ed9da3-cd29-4d80-9703-472bdbb3c64b","Type":"ContainerStarted","Data":"e6f53ee01d2b56f4345ee3366629355e1ed57bd397dfeb870e6198fbb52190c3"} Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.894138 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:14 crc kubenswrapper[4772]: I0127 15:21:14.958815 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" podStartSLOduration=2.282162238 podStartE2EDuration="9.958792538s" podCreationTimestamp="2026-01-27 15:21:05 +0000 UTC" firstStartedPulling="2026-01-27 15:21:06.046780155 +0000 UTC m=+852.027389253" lastFinishedPulling="2026-01-27 15:21:13.723410455 +0000 UTC m=+859.704019553" observedRunningTime="2026-01-27 15:21:14.952325732 +0000 UTC m=+860.932934840" watchObservedRunningTime="2026-01-27 15:21:14.958792538 +0000 UTC m=+860.939401646" Jan 27 15:21:15 crc kubenswrapper[4772]: I0127 15:21:15.915843 4772 generic.go:334] "Generic (PLEG): container finished" podID="5d41fcac-7044-4f36-b9f8-0b656bb3bcca" containerID="9967ee6e447cf31d2e31e46708ac245669b58f5587a46e4d3ff44b9263ca1a06" exitCode=0 Jan 27 15:21:15 crc kubenswrapper[4772]: I0127 15:21:15.917547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerDied","Data":"9967ee6e447cf31d2e31e46708ac245669b58f5587a46e4d3ff44b9263ca1a06"} Jan 27 15:21:16 crc kubenswrapper[4772]: I0127 15:21:16.926539 4772 generic.go:334] "Generic (PLEG): container finished" podID="5d41fcac-7044-4f36-b9f8-0b656bb3bcca" containerID="e945c2714806c1f70092a08fa71ca1b875f7d14aaaea2b52d560d0eba8daf73f" exitCode=0 Jan 27 15:21:16 crc kubenswrapper[4772]: I0127 15:21:16.926616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerDied","Data":"e945c2714806c1f70092a08fa71ca1b875f7d14aaaea2b52d560d0eba8daf73f"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.413289 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-cl54q" Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937128 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"bfed979fcfe5bd812af2efe31e19fb0d984b1fd6aaa19f7833a20a8f64b840b0"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937228 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"4436e7005084c425e3a9d35a46fa4e7c4f85b3ed742ac9d293c8d22b9b849826"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937249 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"e923f314d195a301af3bc586b97562896538b03ab9e799d82ee601b7992f2baa"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937267 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" 
event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"0b029840d4b2e66b85e73893e9bb791afc17bfb135aa1f86951376b56341b53e"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937284 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"2789caab35376e23e9fc736f8a3a208ae2e5b419ee54bf1ea2c19c82e715d467"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937310 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jhpnb" event={"ID":"5d41fcac-7044-4f36-b9f8-0b656bb3bcca","Type":"ContainerStarted","Data":"e4c82b762cf9effb2b151661ce0a466f1b96207b155c3c9f65d48e7046a558a3"} Jan 27 15:21:17 crc kubenswrapper[4772]: I0127 15:21:17.937340 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.800081 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jhpnb" podStartSLOduration=6.066893337 podStartE2EDuration="13.800062963s" podCreationTimestamp="2026-01-27 15:21:05 +0000 UTC" firstStartedPulling="2026-01-27 15:21:06.003906252 +0000 UTC m=+851.984515360" lastFinishedPulling="2026-01-27 15:21:13.737075888 +0000 UTC m=+859.717684986" observedRunningTime="2026-01-27 15:21:17.989767543 +0000 UTC m=+863.970376751" watchObservedRunningTime="2026-01-27 15:21:18.800062963 +0000 UTC m=+864.780672061" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.804105 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg"] Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.805150 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.807539 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.814299 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg"] Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.953667 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl89w\" (UniqueName: \"kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.953754 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:18 crc kubenswrapper[4772]: I0127 15:21:18.953782 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: 
I0127 15:21:19.055545 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.055620 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.055715 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl89w\" (UniqueName: \"kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.056137 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.056257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.089067 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl89w\" (UniqueName: \"kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.120809 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.363365 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg"] Jan 27 15:21:19 crc kubenswrapper[4772]: W0127 15:21:19.371018 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b239be_466d_4995_9c33_38d68a00550d.slice/crio-ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9 WatchSource:0}: Error finding container ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9: Status 404 returned error can't find the container with id ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9 Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.952863 4772 generic.go:334] "Generic (PLEG): container finished" podID="44b239be-466d-4995-9c33-38d68a00550d" containerID="b21c4d3041de620d506243172af9adcd979dc0a44622b73db1c752a3490807be" 
exitCode=0 Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.952991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" event={"ID":"44b239be-466d-4995-9c33-38d68a00550d","Type":"ContainerDied","Data":"b21c4d3041de620d506243172af9adcd979dc0a44622b73db1c752a3490807be"} Jan 27 15:21:19 crc kubenswrapper[4772]: I0127 15:21:19.953214 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" event={"ID":"44b239be-466d-4995-9c33-38d68a00550d","Type":"ContainerStarted","Data":"ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9"} Jan 27 15:21:20 crc kubenswrapper[4772]: I0127 15:21:20.816712 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:20 crc kubenswrapper[4772]: I0127 15:21:20.879362 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jhpnb" Jan 27 15:21:23 crc kubenswrapper[4772]: I0127 15:21:23.995303 4772 generic.go:334] "Generic (PLEG): container finished" podID="44b239be-466d-4995-9c33-38d68a00550d" containerID="cf0e54cd4817a89906c4991b53edd466f93be54e83126f84100f6a7ca6530b41" exitCode=0 Jan 27 15:21:23 crc kubenswrapper[4772]: I0127 15:21:23.995371 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" event={"ID":"44b239be-466d-4995-9c33-38d68a00550d","Type":"ContainerDied","Data":"cf0e54cd4817a89906c4991b53edd466f93be54e83126f84100f6a7ca6530b41"} Jan 27 15:21:25 crc kubenswrapper[4772]: I0127 15:21:25.014155 4772 generic.go:334] "Generic (PLEG): container finished" podID="44b239be-466d-4995-9c33-38d68a00550d" containerID="9c82381351805aef1a46dfd050b839a3c77763cc224fe4c1bf4a211c797b81c0" exitCode=0 Jan 27 15:21:25 crc kubenswrapper[4772]: I0127 
15:21:25.014311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" event={"ID":"44b239be-466d-4995-9c33-38d68a00550d","Type":"ContainerDied","Data":"9c82381351805aef1a46dfd050b839a3c77763cc224fe4c1bf4a211c797b81c0"} Jan 27 15:21:25 crc kubenswrapper[4772]: I0127 15:21:25.838375 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-qpxrs" Jan 27 15:21:25 crc kubenswrapper[4772]: I0127 15:21:25.961714 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-vdg69" Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.305292 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.476987 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util\") pod \"44b239be-466d-4995-9c33-38d68a00550d\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.477092 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle\") pod \"44b239be-466d-4995-9c33-38d68a00550d\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.477119 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl89w\" (UniqueName: \"kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w\") pod \"44b239be-466d-4995-9c33-38d68a00550d\" (UID: \"44b239be-466d-4995-9c33-38d68a00550d\") " Jan 27 15:21:26 crc 
kubenswrapper[4772]: I0127 15:21:26.477987 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle" (OuterVolumeSpecName: "bundle") pod "44b239be-466d-4995-9c33-38d68a00550d" (UID: "44b239be-466d-4995-9c33-38d68a00550d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.482938 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w" (OuterVolumeSpecName: "kube-api-access-rl89w") pod "44b239be-466d-4995-9c33-38d68a00550d" (UID: "44b239be-466d-4995-9c33-38d68a00550d"). InnerVolumeSpecName "kube-api-access-rl89w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.493591 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util" (OuterVolumeSpecName: "util") pod "44b239be-466d-4995-9c33-38d68a00550d" (UID: "44b239be-466d-4995-9c33-38d68a00550d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.578434 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-util\") on node \"crc\" DevicePath \"\""
Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.578475 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b239be-466d-4995-9c33-38d68a00550d-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:21:26 crc kubenswrapper[4772]: I0127 15:21:26.578494 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl89w\" (UniqueName: \"kubernetes.io/projected/44b239be-466d-4995-9c33-38d68a00550d-kube-api-access-rl89w\") on node \"crc\" DevicePath \"\""
Jan 27 15:21:27 crc kubenswrapper[4772]: I0127 15:21:27.032987 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg" event={"ID":"44b239be-466d-4995-9c33-38d68a00550d","Type":"ContainerDied","Data":"ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9"}
Jan 27 15:21:27 crc kubenswrapper[4772]: I0127 15:21:27.033043 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef16ee5c9b0f6269a183df8b440e38235f63a12a2af4b9d1a81c3a7c37222be9"
Jan 27 15:21:27 crc kubenswrapper[4772]: I0127 15:21:27.033128 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.065355 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"]
Jan 27 15:21:32 crc kubenswrapper[4772]: E0127 15:21:32.065920 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="extract"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.065934 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="extract"
Jan 27 15:21:32 crc kubenswrapper[4772]: E0127 15:21:32.065958 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="pull"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.065965 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="pull"
Jan 27 15:21:32 crc kubenswrapper[4772]: E0127 15:21:32.065978 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="util"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.065987 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="util"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.066109 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b239be-466d-4995-9c33-38d68a00550d" containerName="extract"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.066594 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.071226 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-52slv"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.073037 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.073519 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.079045 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zgq5\" (UniqueName: \"kubernetes.io/projected/649069f7-0947-4089-9e65-ae192e952f8e-kube-api-access-8zgq5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.079118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/649069f7-0947-4089-9e65-ae192e952f8e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.095873 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"]
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.180528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/649069f7-0947-4089-9e65-ae192e952f8e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.180640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zgq5\" (UniqueName: \"kubernetes.io/projected/649069f7-0947-4089-9e65-ae192e952f8e-kube-api-access-8zgq5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.181138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/649069f7-0947-4089-9e65-ae192e952f8e-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.221925 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zgq5\" (UniqueName: \"kubernetes.io/projected/649069f7-0947-4089-9e65-ae192e952f8e-kube-api-access-8zgq5\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fh8c9\" (UID: \"649069f7-0947-4089-9e65-ae192e952f8e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.381696 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"
Jan 27 15:21:32 crc kubenswrapper[4772]: I0127 15:21:32.637585 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9"]
Jan 27 15:21:32 crc kubenswrapper[4772]: W0127 15:21:32.643380 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod649069f7_0947_4089_9e65_ae192e952f8e.slice/crio-bf7f6f71230081901c74a00d96446a7b7c53a966a8f273c8c4a01df45ebc987a WatchSource:0}: Error finding container bf7f6f71230081901c74a00d96446a7b7c53a966a8f273c8c4a01df45ebc987a: Status 404 returned error can't find the container with id bf7f6f71230081901c74a00d96446a7b7c53a966a8f273c8c4a01df45ebc987a
Jan 27 15:21:33 crc kubenswrapper[4772]: I0127 15:21:33.069160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9" event={"ID":"649069f7-0947-4089-9e65-ae192e952f8e","Type":"ContainerStarted","Data":"bf7f6f71230081901c74a00d96446a7b7c53a966a8f273c8c4a01df45ebc987a"}
Jan 27 15:21:35 crc kubenswrapper[4772]: I0127 15:21:35.821106 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jhpnb"
Jan 27 15:21:40 crc kubenswrapper[4772]: I0127 15:21:40.124267 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9" event={"ID":"649069f7-0947-4089-9e65-ae192e952f8e","Type":"ContainerStarted","Data":"8e9ab94d2d2b624ead64b2485da2e8caa695e68f615d94bc2982b5a1ceebe2ed"}
Jan 27 15:21:40 crc kubenswrapper[4772]: I0127 15:21:40.142723 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fh8c9" podStartSLOduration=1.22977818 podStartE2EDuration="8.14269739s" podCreationTimestamp="2026-01-27 15:21:32 +0000 UTC" firstStartedPulling="2026-01-27 15:21:32.648100354 +0000 UTC m=+878.628709452" lastFinishedPulling="2026-01-27 15:21:39.561019564 +0000 UTC m=+885.541628662" observedRunningTime="2026-01-27 15:21:40.141122274 +0000 UTC m=+886.121731392" watchObservedRunningTime="2026-01-27 15:21:40.14269739 +0000 UTC m=+886.123306518"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.554650 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-4886f"]
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.556318 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.561050 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.561087 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.565510 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-4886f"]
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.566818 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8r264"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.745565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqlbf\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-kube-api-access-hqlbf\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.745679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.848394 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.848530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqlbf\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-kube-api-access-hqlbf\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.872895 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.874324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqlbf\" (UniqueName: \"kubernetes.io/projected/ab67a3dd-5a79-400f-9b27-294ef256823d-kube-api-access-hqlbf\") pod \"cert-manager-webhook-f4fb5df64-4886f\" (UID: \"ab67a3dd-5a79-400f-9b27-294ef256823d\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:43 crc kubenswrapper[4772]: I0127 15:21:43.899216 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:44 crc kubenswrapper[4772]: I0127 15:21:44.119100 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-4886f"]
Jan 27 15:21:44 crc kubenswrapper[4772]: I0127 15:21:44.153962 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f" event={"ID":"ab67a3dd-5a79-400f-9b27-294ef256823d","Type":"ContainerStarted","Data":"064f31700f599bf883e42ed7b95cbf043907cbd1f50c0ddff52a145423ce43af"}
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.194250 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"]
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.195260 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.197379 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8pmkj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.201626 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"]
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.276856 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7dp\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-kube-api-access-fk7dp\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.276957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.377803 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7dp\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-kube-api-access-fk7dp\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.378108 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.397107 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.398304 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7dp\" (UniqueName: \"kubernetes.io/projected/8d401dfc-33d3-416f-abba-cad4a1e173bd-kube-api-access-fk7dp\") pod \"cert-manager-cainjector-855d9ccff4-s7ksj\" (UID: \"8d401dfc-33d3-416f-abba-cad4a1e173bd\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.517848 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"
Jan 27 15:21:46 crc kubenswrapper[4772]: I0127 15:21:46.738087 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj"]
Jan 27 15:21:47 crc kubenswrapper[4772]: I0127 15:21:47.176822 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj" event={"ID":"8d401dfc-33d3-416f-abba-cad4a1e173bd","Type":"ContainerStarted","Data":"1fe539967b131aa9595b66edc664c29e3406effab2e49a53e7c776bc52a08faa"}
Jan 27 15:21:53 crc kubenswrapper[4772]: I0127 15:21:53.207699 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj" event={"ID":"8d401dfc-33d3-416f-abba-cad4a1e173bd","Type":"ContainerStarted","Data":"2af0c1e9f079137b54a488e65ed83dd57de1eb3f2452bab5e7bd7083aefa0e52"}
Jan 27 15:21:53 crc kubenswrapper[4772]: I0127 15:21:53.210129 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f" event={"ID":"ab67a3dd-5a79-400f-9b27-294ef256823d","Type":"ContainerStarted","Data":"cd9cd0ee0baebdf60fd4c2c607e947c53e17c28f3d2ced16c697a54ad535f73b"}
Jan 27 15:21:53 crc kubenswrapper[4772]: I0127 15:21:53.210252 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:53 crc kubenswrapper[4772]: I0127 15:21:53.251592 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-s7ksj" podStartSLOduration=1.980454093 podStartE2EDuration="7.251572894s" podCreationTimestamp="2026-01-27 15:21:46 +0000 UTC" firstStartedPulling="2026-01-27 15:21:46.754332726 +0000 UTC m=+892.734941824" lastFinishedPulling="2026-01-27 15:21:52.025451517 +0000 UTC m=+898.006060625" observedRunningTime="2026-01-27 15:21:52.220357672 +0000 UTC m=+898.200966770" watchObservedRunningTime="2026-01-27 15:21:53.251572894 +0000 UTC m=+899.232181992"
Jan 27 15:21:53 crc kubenswrapper[4772]: I0127 15:21:53.253243 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f" podStartSLOduration=2.328066595 podStartE2EDuration="10.253235712s" podCreationTimestamp="2026-01-27 15:21:43 +0000 UTC" firstStartedPulling="2026-01-27 15:21:44.13853726 +0000 UTC m=+890.119146348" lastFinishedPulling="2026-01-27 15:21:52.063706367 +0000 UTC m=+898.044315465" observedRunningTime="2026-01-27 15:21:53.249893816 +0000 UTC m=+899.230502934" watchObservedRunningTime="2026-01-27 15:21:53.253235712 +0000 UTC m=+899.233844810"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.309495 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-bmz8p"]
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.310731 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.313995 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wrc5s"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.322389 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-bmz8p"]
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.347862 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-bound-sa-token\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.347925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9rcj\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-kube-api-access-x9rcj\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.448785 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-bound-sa-token\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.448863 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9rcj\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-kube-api-access-x9rcj\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.467064 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9rcj\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-kube-api-access-x9rcj\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.470960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7b51171e-5d65-415e-8052-3cc8991f5de4-bound-sa-token\") pod \"cert-manager-86cb77c54b-bmz8p\" (UID: \"7b51171e-5d65-415e-8052-3cc8991f5de4\") " pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.634945 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wrc5s"
Jan 27 15:21:54 crc kubenswrapper[4772]: I0127 15:21:54.643832 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-bmz8p"
Jan 27 15:21:55 crc kubenswrapper[4772]: I0127 15:21:55.046015 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-bmz8p"]
Jan 27 15:21:55 crc kubenswrapper[4772]: W0127 15:21:55.050303 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b51171e_5d65_415e_8052_3cc8991f5de4.slice/crio-9f7057cb198afc00edbb3717c98144058c5b710f4507c7ac0b1afce1b6e73def WatchSource:0}: Error finding container 9f7057cb198afc00edbb3717c98144058c5b710f4507c7ac0b1afce1b6e73def: Status 404 returned error can't find the container with id 9f7057cb198afc00edbb3717c98144058c5b710f4507c7ac0b1afce1b6e73def
Jan 27 15:21:55 crc kubenswrapper[4772]: I0127 15:21:55.225256 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-bmz8p" event={"ID":"7b51171e-5d65-415e-8052-3cc8991f5de4","Type":"ContainerStarted","Data":"b9cc0505c7e78010c61f93057c4ae5e6a5ca8b108d2e0825f7101acb70d5db1d"}
Jan 27 15:21:55 crc kubenswrapper[4772]: I0127 15:21:55.225657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-bmz8p" event={"ID":"7b51171e-5d65-415e-8052-3cc8991f5de4","Type":"ContainerStarted","Data":"9f7057cb198afc00edbb3717c98144058c5b710f4507c7ac0b1afce1b6e73def"}
Jan 27 15:21:55 crc kubenswrapper[4772]: I0127 15:21:55.246450 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-bmz8p" podStartSLOduration=1.246429646 podStartE2EDuration="1.246429646s" podCreationTimestamp="2026-01-27 15:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:21:55.244637535 +0000 UTC m=+901.225246643" watchObservedRunningTime="2026-01-27 15:21:55.246429646 +0000 UTC m=+901.227038744"
Jan 27 15:21:57 crc kubenswrapper[4772]: I0127 15:21:57.933783 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"]
Jan 27 15:21:57 crc kubenswrapper[4772]: I0127 15:21:57.935392 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:57 crc kubenswrapper[4772]: I0127 15:21:57.955059 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"]
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.002917 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxgxq\" (UniqueName: \"kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.003006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.003093 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.104821 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxgxq\" (UniqueName: \"kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.104927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.104988 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.105453 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.105534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.128364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxgxq\" (UniqueName: \"kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq\") pod \"redhat-marketplace-6pm7m\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.261139 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.507236 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"]
Jan 27 15:21:58 crc kubenswrapper[4772]: I0127 15:21:58.901399 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-4886f"
Jan 27 15:21:59 crc kubenswrapper[4772]: I0127 15:21:59.252488 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdc687da-0857-4070-a27a-90b08ca108c9" containerID="94686f897895f9959663a28bfa07c2aaa2acb444714ed141333b20d65eed7fc2" exitCode=0
Jan 27 15:21:59 crc kubenswrapper[4772]: I0127 15:21:59.252549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerDied","Data":"94686f897895f9959663a28bfa07c2aaa2acb444714ed141333b20d65eed7fc2"}
Jan 27 15:21:59 crc kubenswrapper[4772]: I0127 15:21:59.252585 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerStarted","Data":"41d3f481a49e568f6c8ae964bb8f25577c85d9700b3c5df1b8ce8fff898b4f29"}
Jan 27 15:22:01 crc kubenswrapper[4772]: I0127 15:22:01.272294 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdc687da-0857-4070-a27a-90b08ca108c9" containerID="6a60cf692ab3a3eae06e2802f1931e40f49d38619fda7b12604a36a8f58a1dbe" exitCode=0
Jan 27 15:22:01 crc kubenswrapper[4772]: I0127 15:22:01.272442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerDied","Data":"6a60cf692ab3a3eae06e2802f1931e40f49d38619fda7b12604a36a8f58a1dbe"}
Jan 27 15:22:02 crc kubenswrapper[4772]: I0127 15:22:02.280962 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerStarted","Data":"5b7851c728141b28bc16da228e27af7b01b686d419421b07643b0ccbbf7acd5a"}
Jan 27 15:22:02 crc kubenswrapper[4772]: I0127 15:22:02.314107 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pm7m" podStartSLOduration=2.8753379199999998 podStartE2EDuration="5.314081786s" podCreationTimestamp="2026-01-27 15:21:57 +0000 UTC" firstStartedPulling="2026-01-27 15:21:59.254256461 +0000 UTC m=+905.234865599" lastFinishedPulling="2026-01-27 15:22:01.693000327 +0000 UTC m=+907.673609465" observedRunningTime="2026-01-27 15:22:02.306692724 +0000 UTC m=+908.287301842" watchObservedRunningTime="2026-01-27 15:22:02.314081786 +0000 UTC m=+908.294690924"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.324946 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"]
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.327094 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7lq2b"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.334997 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.335196 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.335484 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-4m6p5"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.336503 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"]
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.411822 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5hnl\" (UniqueName: \"kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl\") pod \"openstack-operator-index-7lq2b\" (UID: \"13489498-3c32-4ef1-baf5-99f6907d07e4\") " pod="openstack-operators/openstack-operator-index-7lq2b"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.513815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5hnl\" (UniqueName: \"kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl\") pod \"openstack-operator-index-7lq2b\" (UID: \"13489498-3c32-4ef1-baf5-99f6907d07e4\") " pod="openstack-operators/openstack-operator-index-7lq2b"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.547943 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5hnl\" (UniqueName: \"kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl\") pod \"openstack-operator-index-7lq2b\" (UID: \"13489498-3c32-4ef1-baf5-99f6907d07e4\") " pod="openstack-operators/openstack-operator-index-7lq2b"
Jan 27 15:22:05 crc kubenswrapper[4772]: I0127 15:22:05.663949 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7lq2b"
Jan 27 15:22:06 crc kubenswrapper[4772]: I0127 15:22:06.140478 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"]
Jan 27 15:22:06 crc kubenswrapper[4772]: I0127 15:22:06.307247 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lq2b" event={"ID":"13489498-3c32-4ef1-baf5-99f6907d07e4","Type":"ContainerStarted","Data":"055263e39566289af029844504c4a8375ca8aa346f646c3a569e5ad465cedaa7"}
Jan 27 15:22:08 crc kubenswrapper[4772]: I0127 15:22:08.262081 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:22:08 crc kubenswrapper[4772]: I0127 15:22:08.262497 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:22:08 crc kubenswrapper[4772]: I0127 15:22:08.302506 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:22:08 crc kubenswrapper[4772]: I0127 15:22:08.367468 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pm7m"
Jan 27 15:22:09 crc kubenswrapper[4772]: I0127 15:22:09.326455 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lq2b" event={"ID":"13489498-3c32-4ef1-baf5-99f6907d07e4","Type":"ContainerStarted","Data":"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4"}
Jan 27 15:22:09 crc kubenswrapper[4772]: I0127 15:22:09.342964 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7lq2b" podStartSLOduration=1.7184273129999998 podStartE2EDuration="4.34293812s" podCreationTimestamp="2026-01-27 15:22:05 +0000 UTC" firstStartedPulling="2026-01-27 15:22:06.208593123 +0000 UTC m=+912.189202221" lastFinishedPulling="2026-01-27 15:22:08.83310389 +0000 UTC m=+914.813713028" observedRunningTime="2026-01-27 15:22:09.341386925 +0000 UTC m=+915.321996063" watchObservedRunningTime="2026-01-27 15:22:09.34293812 +0000 UTC m=+915.323547238"
Jan 27 15:22:09 crc kubenswrapper[4772]: I0127 15:22:09.902822 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"]
Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.512119 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vs9rk"]
Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.513037 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vs9rk"
Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.522695 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vs9rk"]
Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.585806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9lb\" (UniqueName: \"kubernetes.io/projected/96fcf3a5-2584-4590-8057-9c18a9866bd4-kube-api-access-mq9lb\") pod \"openstack-operator-index-vs9rk\" (UID: \"96fcf3a5-2584-4590-8057-9c18a9866bd4\") " pod="openstack-operators/openstack-operator-index-vs9rk"
Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.687418 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9lb\" (UniqueName: \"kubernetes.io/projected/96fcf3a5-2584-4590-8057-9c18a9866bd4-kube-api-access-mq9lb\") pod \"openstack-operator-index-vs9rk\" (UID:
\"96fcf3a5-2584-4590-8057-9c18a9866bd4\") " pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.722262 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9lb\" (UniqueName: \"kubernetes.io/projected/96fcf3a5-2584-4590-8057-9c18a9866bd4-kube-api-access-mq9lb\") pod \"openstack-operator-index-vs9rk\" (UID: \"96fcf3a5-2584-4590-8057-9c18a9866bd4\") " pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:10 crc kubenswrapper[4772]: I0127 15:22:10.843271 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:11 crc kubenswrapper[4772]: W0127 15:22:11.287248 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96fcf3a5_2584_4590_8057_9c18a9866bd4.slice/crio-51e818974842a23b8b8c4b017ed47491446211a1c89d4658b8224801df3d4faf WatchSource:0}: Error finding container 51e818974842a23b8b8c4b017ed47491446211a1c89d4658b8224801df3d4faf: Status 404 returned error can't find the container with id 51e818974842a23b8b8c4b017ed47491446211a1c89d4658b8224801df3d4faf Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.287736 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vs9rk"] Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.341430 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vs9rk" event={"ID":"96fcf3a5-2584-4590-8057-9c18a9866bd4","Type":"ContainerStarted","Data":"51e818974842a23b8b8c4b017ed47491446211a1c89d4658b8224801df3d4faf"} Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.341664 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7lq2b" podUID="13489498-3c32-4ef1-baf5-99f6907d07e4" 
containerName="registry-server" containerID="cri-o://d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4" gracePeriod=2 Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.692534 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7lq2b" Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.804580 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5hnl\" (UniqueName: \"kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl\") pod \"13489498-3c32-4ef1-baf5-99f6907d07e4\" (UID: \"13489498-3c32-4ef1-baf5-99f6907d07e4\") " Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.809641 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl" (OuterVolumeSpecName: "kube-api-access-j5hnl") pod "13489498-3c32-4ef1-baf5-99f6907d07e4" (UID: "13489498-3c32-4ef1-baf5-99f6907d07e4"). InnerVolumeSpecName "kube-api-access-j5hnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:22:11 crc kubenswrapper[4772]: I0127 15:22:11.906447 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5hnl\" (UniqueName: \"kubernetes.io/projected/13489498-3c32-4ef1-baf5-99f6907d07e4-kube-api-access-j5hnl\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.058746 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.059257 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.107645 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"] Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.107983 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pm7m" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="registry-server" containerID="cri-o://5b7851c728141b28bc16da228e27af7b01b686d419421b07643b0ccbbf7acd5a" gracePeriod=2 Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.350951 4772 generic.go:334] "Generic (PLEG): container finished" podID="bdc687da-0857-4070-a27a-90b08ca108c9" containerID="5b7851c728141b28bc16da228e27af7b01b686d419421b07643b0ccbbf7acd5a" exitCode=0 Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.351044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerDied","Data":"5b7851c728141b28bc16da228e27af7b01b686d419421b07643b0ccbbf7acd5a"} Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.352538 4772 generic.go:334] "Generic (PLEG): container finished" podID="13489498-3c32-4ef1-baf5-99f6907d07e4" containerID="d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4" exitCode=0 Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.352614 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7lq2b" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.352625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lq2b" event={"ID":"13489498-3c32-4ef1-baf5-99f6907d07e4","Type":"ContainerDied","Data":"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4"} Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.352666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7lq2b" event={"ID":"13489498-3c32-4ef1-baf5-99f6907d07e4","Type":"ContainerDied","Data":"055263e39566289af029844504c4a8375ca8aa346f646c3a569e5ad465cedaa7"} Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.352697 4772 scope.go:117] "RemoveContainer" containerID="d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.354299 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vs9rk" event={"ID":"96fcf3a5-2584-4590-8057-9c18a9866bd4","Type":"ContainerStarted","Data":"38b35858a30e11e3719ec119912ad142df8868545490ca4c217b5258f55412a0"} Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.381953 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-vs9rk" podStartSLOduration=2.305204858 podStartE2EDuration="2.381929635s" podCreationTimestamp="2026-01-27 15:22:10 +0000 UTC" firstStartedPulling="2026-01-27 15:22:11.295899087 +0000 UTC m=+917.276508205" lastFinishedPulling="2026-01-27 15:22:11.372623884 +0000 UTC m=+917.353232982" observedRunningTime="2026-01-27 15:22:12.378403784 +0000 UTC m=+918.359012912" watchObservedRunningTime="2026-01-27 15:22:12.381929635 +0000 UTC m=+918.362538753" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.398816 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"] Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.399353 4772 scope.go:117] "RemoveContainer" containerID="d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4" Jan 27 15:22:12 crc kubenswrapper[4772]: E0127 15:22:12.399794 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4\": container with ID starting with d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4 not found: ID does not exist" containerID="d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.399831 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4"} err="failed to get container status \"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4\": rpc error: code = NotFound desc = could not find container \"d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4\": container with ID starting with d289b63c1084f4442f5f1a2e1c20d7abe82ebe1562d47d96d12194c6053992a4 not found: ID does not exist" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.410465 4772 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-7lq2b"] Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.510679 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pm7m" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.621471 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxgxq\" (UniqueName: \"kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq\") pod \"bdc687da-0857-4070-a27a-90b08ca108c9\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.621541 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities\") pod \"bdc687da-0857-4070-a27a-90b08ca108c9\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.621579 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content\") pod \"bdc687da-0857-4070-a27a-90b08ca108c9\" (UID: \"bdc687da-0857-4070-a27a-90b08ca108c9\") " Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.622351 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities" (OuterVolumeSpecName: "utilities") pod "bdc687da-0857-4070-a27a-90b08ca108c9" (UID: "bdc687da-0857-4070-a27a-90b08ca108c9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.625037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq" (OuterVolumeSpecName: "kube-api-access-gxgxq") pod "bdc687da-0857-4070-a27a-90b08ca108c9" (UID: "bdc687da-0857-4070-a27a-90b08ca108c9"). InnerVolumeSpecName "kube-api-access-gxgxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.666041 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdc687da-0857-4070-a27a-90b08ca108c9" (UID: "bdc687da-0857-4070-a27a-90b08ca108c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.677410 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13489498-3c32-4ef1-baf5-99f6907d07e4" path="/var/lib/kubelet/pods/13489498-3c32-4ef1-baf5-99f6907d07e4/volumes" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.722736 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.722760 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxgxq\" (UniqueName: \"kubernetes.io/projected/bdc687da-0857-4070-a27a-90b08ca108c9-kube-api-access-gxgxq\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:12 crc kubenswrapper[4772]: I0127 15:22:12.722772 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc687da-0857-4070-a27a-90b08ca108c9-utilities\") on node 
\"crc\" DevicePath \"\"" Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.365540 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pm7m" event={"ID":"bdc687da-0857-4070-a27a-90b08ca108c9","Type":"ContainerDied","Data":"41d3f481a49e568f6c8ae964bb8f25577c85d9700b3c5df1b8ce8fff898b4f29"} Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.365612 4772 scope.go:117] "RemoveContainer" containerID="5b7851c728141b28bc16da228e27af7b01b686d419421b07643b0ccbbf7acd5a" Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.365684 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pm7m" Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.396210 4772 scope.go:117] "RemoveContainer" containerID="6a60cf692ab3a3eae06e2802f1931e40f49d38619fda7b12604a36a8f58a1dbe" Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.404430 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"] Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.410467 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pm7m"] Jan 27 15:22:13 crc kubenswrapper[4772]: I0127 15:22:13.420867 4772 scope.go:117] "RemoveContainer" containerID="94686f897895f9959663a28bfa07c2aaa2acb444714ed141333b20d65eed7fc2" Jan 27 15:22:14 crc kubenswrapper[4772]: I0127 15:22:14.677623 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" path="/var/lib/kubelet/pods/bdc687da-0857-4070-a27a-90b08ca108c9/volumes" Jan 27 15:22:20 crc kubenswrapper[4772]: I0127 15:22:20.844443 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:20 crc kubenswrapper[4772]: I0127 15:22:20.845021 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:20 crc kubenswrapper[4772]: I0127 15:22:20.881925 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:21 crc kubenswrapper[4772]: I0127 15:22:21.462600 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-vs9rk" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.952273 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4"] Jan 27 15:22:22 crc kubenswrapper[4772]: E0127 15:22:22.952967 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.952982 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: E0127 15:22:22.952996 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="extract-content" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953004 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="extract-content" Jan 27 15:22:22 crc kubenswrapper[4772]: E0127 15:22:22.953030 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13489498-3c32-4ef1-baf5-99f6907d07e4" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953038 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="13489498-3c32-4ef1-baf5-99f6907d07e4" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: E0127 15:22:22.953047 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="extract-utilities" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953053 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="extract-utilities" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953197 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdc687da-0857-4070-a27a-90b08ca108c9" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953212 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="13489498-3c32-4ef1-baf5-99f6907d07e4" containerName="registry-server" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.953965 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.957028 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-nwbtg" Jan 27 15:22:22 crc kubenswrapper[4772]: I0127 15:22:22.967771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4"] Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.062541 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff7tn\" (UniqueName: \"kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.062581 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.062611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.163683 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.163759 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.163877 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff7tn\" (UniqueName: \"kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" 
(UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.164410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.164412 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.183125 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff7tn\" (UniqueName: \"kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn\") pod \"6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.269229 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:23 crc kubenswrapper[4772]: I0127 15:22:23.565707 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4"] Jan 27 15:22:24 crc kubenswrapper[4772]: I0127 15:22:24.447690 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerID="3eed0491dec9fe3e452816050dcb0d79415f19c85bbaf20cf8830f09120d7753" exitCode=0 Jan 27 15:22:24 crc kubenswrapper[4772]: I0127 15:22:24.447744 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" event={"ID":"0f29ea34-f593-4806-b5f6-2f9976c46a12","Type":"ContainerDied","Data":"3eed0491dec9fe3e452816050dcb0d79415f19c85bbaf20cf8830f09120d7753"} Jan 27 15:22:24 crc kubenswrapper[4772]: I0127 15:22:24.447788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" event={"ID":"0f29ea34-f593-4806-b5f6-2f9976c46a12","Type":"ContainerStarted","Data":"c56568fce76e8df1f0305837d648767268ec5d470bc759ec6509cc21e8e2cbc4"} Jan 27 15:22:27 crc kubenswrapper[4772]: I0127 15:22:27.478279 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerID="79203eaf770d19a8deea1ab6fa237bbc2f2bf50b4932961ae73b8351e90c5716" exitCode=0 Jan 27 15:22:27 crc kubenswrapper[4772]: I0127 15:22:27.478394 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" event={"ID":"0f29ea34-f593-4806-b5f6-2f9976c46a12","Type":"ContainerDied","Data":"79203eaf770d19a8deea1ab6fa237bbc2f2bf50b4932961ae73b8351e90c5716"} Jan 27 15:22:28 crc kubenswrapper[4772]: I0127 15:22:28.488303 4772 generic.go:334] 
"Generic (PLEG): container finished" podID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerID="e5cb8ba5058b870db0b2b7ce8c9625be9b47bfc608e7bdd04c686bcba481c1b6" exitCode=0 Jan 27 15:22:28 crc kubenswrapper[4772]: I0127 15:22:28.488375 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" event={"ID":"0f29ea34-f593-4806-b5f6-2f9976c46a12","Type":"ContainerDied","Data":"e5cb8ba5058b870db0b2b7ce8c9625be9b47bfc608e7bdd04c686bcba481c1b6"} Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.824462 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.877098 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle\") pod \"0f29ea34-f593-4806-b5f6-2f9976c46a12\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.877221 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util\") pod \"0f29ea34-f593-4806-b5f6-2f9976c46a12\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.877248 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff7tn\" (UniqueName: \"kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn\") pod \"0f29ea34-f593-4806-b5f6-2f9976c46a12\" (UID: \"0f29ea34-f593-4806-b5f6-2f9976c46a12\") " Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.877887 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle" (OuterVolumeSpecName: "bundle") pod "0f29ea34-f593-4806-b5f6-2f9976c46a12" (UID: "0f29ea34-f593-4806-b5f6-2f9976c46a12"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.883437 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn" (OuterVolumeSpecName: "kube-api-access-ff7tn") pod "0f29ea34-f593-4806-b5f6-2f9976c46a12" (UID: "0f29ea34-f593-4806-b5f6-2f9976c46a12"). InnerVolumeSpecName "kube-api-access-ff7tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.887699 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util" (OuterVolumeSpecName: "util") pod "0f29ea34-f593-4806-b5f6-2f9976c46a12" (UID: "0f29ea34-f593-4806-b5f6-2f9976c46a12"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.979329 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.979365 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f29ea34-f593-4806-b5f6-2f9976c46a12-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:29 crc kubenswrapper[4772]: I0127 15:22:29.979379 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff7tn\" (UniqueName: \"kubernetes.io/projected/0f29ea34-f593-4806-b5f6-2f9976c46a12-kube-api-access-ff7tn\") on node \"crc\" DevicePath \"\"" Jan 27 15:22:30 crc kubenswrapper[4772]: I0127 15:22:30.508399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" event={"ID":"0f29ea34-f593-4806-b5f6-2f9976c46a12","Type":"ContainerDied","Data":"c56568fce76e8df1f0305837d648767268ec5d470bc759ec6509cc21e8e2cbc4"} Jan 27 15:22:30 crc kubenswrapper[4772]: I0127 15:22:30.508470 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c56568fce76e8df1f0305837d648767268ec5d470bc759ec6509cc21e8e2cbc4" Jan 27 15:22:30 crc kubenswrapper[4772]: I0127 15:22:30.508741 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.294300 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt"] Jan 27 15:22:35 crc kubenswrapper[4772]: E0127 15:22:35.295136 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="util" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.295151 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="util" Jan 27 15:22:35 crc kubenswrapper[4772]: E0127 15:22:35.295182 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="pull" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.295190 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="pull" Jan 27 15:22:35 crc kubenswrapper[4772]: E0127 15:22:35.295204 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="extract" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.295213 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="extract" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.295356 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f29ea34-f593-4806-b5f6-2f9976c46a12" containerName="extract" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.295860 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.298544 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rhmmw" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.326112 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt"] Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.361512 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4nv\" (UniqueName: \"kubernetes.io/projected/939a692e-65d1-4be8-b78a-22ae83072d51-kube-api-access-2z4nv\") pod \"openstack-operator-controller-init-6fb647f7d4-gkjgt\" (UID: \"939a692e-65d1-4be8-b78a-22ae83072d51\") " pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.462553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4nv\" (UniqueName: \"kubernetes.io/projected/939a692e-65d1-4be8-b78a-22ae83072d51-kube-api-access-2z4nv\") pod \"openstack-operator-controller-init-6fb647f7d4-gkjgt\" (UID: \"939a692e-65d1-4be8-b78a-22ae83072d51\") " pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.485409 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4nv\" (UniqueName: \"kubernetes.io/projected/939a692e-65d1-4be8-b78a-22ae83072d51-kube-api-access-2z4nv\") pod \"openstack-operator-controller-init-6fb647f7d4-gkjgt\" (UID: \"939a692e-65d1-4be8-b78a-22ae83072d51\") " pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:35 crc kubenswrapper[4772]: I0127 15:22:35.614593 4772 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:36 crc kubenswrapper[4772]: I0127 15:22:36.152912 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt"] Jan 27 15:22:36 crc kubenswrapper[4772]: I0127 15:22:36.556809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" event={"ID":"939a692e-65d1-4be8-b78a-22ae83072d51","Type":"ContainerStarted","Data":"285430d1d2da8bf5c3321cad317f7ec39aefceedaed1c75e38d18002294544e1"} Jan 27 15:22:40 crc kubenswrapper[4772]: I0127 15:22:40.581769 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" event={"ID":"939a692e-65d1-4be8-b78a-22ae83072d51","Type":"ContainerStarted","Data":"0bb32ca286b4bb63a4601ea17d681fa5e57fc6fd9be09d28d3803335c16b8c89"} Jan 27 15:22:40 crc kubenswrapper[4772]: I0127 15:22:40.582285 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:40 crc kubenswrapper[4772]: I0127 15:22:40.606238 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" podStartSLOduration=2.124067436 podStartE2EDuration="5.606223645s" podCreationTimestamp="2026-01-27 15:22:35 +0000 UTC" firstStartedPulling="2026-01-27 15:22:36.158154182 +0000 UTC m=+942.138763280" lastFinishedPulling="2026-01-27 15:22:39.640310391 +0000 UTC m=+945.620919489" observedRunningTime="2026-01-27 15:22:40.602734035 +0000 UTC m=+946.583343163" watchObservedRunningTime="2026-01-27 15:22:40.606223645 +0000 UTC m=+946.586832753" Jan 27 15:22:42 crc kubenswrapper[4772]: I0127 15:22:42.059179 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:22:42 crc kubenswrapper[4772]: I0127 15:22:42.059550 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:22:45 crc kubenswrapper[4772]: I0127 15:22:45.617996 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6fb647f7d4-gkjgt" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.016562 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.018189 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.030954 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.031037 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.031089 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5fc\" (UniqueName: \"kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.036794 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.132722 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5fc\" (UniqueName: \"kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.132786 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.132842 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.133317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.133407 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.163204 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5fc\" (UniqueName: \"kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc\") pod \"certified-operators-wbmd9\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.381511 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.815177 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:22:54 crc kubenswrapper[4772]: I0127 15:22:54.872084 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerStarted","Data":"ae5e43e320906455ee157d2ed44f17f1214d92e37e6149b34c7998975892c8dd"} Jan 27 15:22:56 crc kubenswrapper[4772]: I0127 15:22:56.893778 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerID="2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643" exitCode=0 Jan 27 15:22:56 crc kubenswrapper[4772]: I0127 15:22:56.893868 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerDied","Data":"2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643"} Jan 27 15:22:59 crc kubenswrapper[4772]: I0127 15:22:59.920534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerStarted","Data":"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b"} Jan 27 15:23:00 crc kubenswrapper[4772]: I0127 15:23:00.929004 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerID="bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b" exitCode=0 Jan 27 15:23:00 crc kubenswrapper[4772]: I0127 15:23:00.929063 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" 
event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerDied","Data":"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b"} Jan 27 15:23:01 crc kubenswrapper[4772]: I0127 15:23:01.939253 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerStarted","Data":"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb"} Jan 27 15:23:01 crc kubenswrapper[4772]: I0127 15:23:01.962406 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wbmd9" podStartSLOduration=5.380610437 podStartE2EDuration="8.96238828s" podCreationTimestamp="2026-01-27 15:22:53 +0000 UTC" firstStartedPulling="2026-01-27 15:22:57.903635592 +0000 UTC m=+963.884244690" lastFinishedPulling="2026-01-27 15:23:01.485413435 +0000 UTC m=+967.466022533" observedRunningTime="2026-01-27 15:23:01.956535432 +0000 UTC m=+967.937144530" watchObservedRunningTime="2026-01-27 15:23:01.96238828 +0000 UTC m=+967.942997378" Jan 27 15:23:04 crc kubenswrapper[4772]: I0127 15:23:04.382696 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:04 crc kubenswrapper[4772]: I0127 15:23:04.383475 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:04 crc kubenswrapper[4772]: I0127 15:23:04.436829 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:12 crc kubenswrapper[4772]: I0127 15:23:12.059031 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Jan 27 15:23:12 crc kubenswrapper[4772]: I0127 15:23:12.059799 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:23:12 crc kubenswrapper[4772]: I0127 15:23:12.059850 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:23:12 crc kubenswrapper[4772]: I0127 15:23:12.060566 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:23:12 crc kubenswrapper[4772]: I0127 15:23:12.060636 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497" gracePeriod=600 Jan 27 15:23:13 crc kubenswrapper[4772]: I0127 15:23:13.010444 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497" exitCode=0 Jan 27 15:23:13 crc kubenswrapper[4772]: I0127 15:23:13.010492 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497"} Jan 27 15:23:13 crc kubenswrapper[4772]: I0127 15:23:13.010800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163"} Jan 27 15:23:13 crc kubenswrapper[4772]: I0127 15:23:13.010823 4772 scope.go:117] "RemoveContainer" containerID="60c798dfb542a875b90e857bf6f54352abce005f4bc0c5fd246c1b5d0903e3f3" Jan 27 15:23:14 crc kubenswrapper[4772]: I0127 15:23:14.443666 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:14 crc kubenswrapper[4772]: I0127 15:23:14.502573 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.026051 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wbmd9" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="registry-server" containerID="cri-o://370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb" gracePeriod=2 Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.375030 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.550741 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content\") pod \"8e682da4-be9e-4318-8f65-cd879f9a826a\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.550855 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities\") pod \"8e682da4-be9e-4318-8f65-cd879f9a826a\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.550897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv5fc\" (UniqueName: \"kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc\") pod \"8e682da4-be9e-4318-8f65-cd879f9a826a\" (UID: \"8e682da4-be9e-4318-8f65-cd879f9a826a\") " Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.551808 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities" (OuterVolumeSpecName: "utilities") pod "8e682da4-be9e-4318-8f65-cd879f9a826a" (UID: "8e682da4-be9e-4318-8f65-cd879f9a826a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.556459 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc" (OuterVolumeSpecName: "kube-api-access-tv5fc") pod "8e682da4-be9e-4318-8f65-cd879f9a826a" (UID: "8e682da4-be9e-4318-8f65-cd879f9a826a"). InnerVolumeSpecName "kube-api-access-tv5fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.602858 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e682da4-be9e-4318-8f65-cd879f9a826a" (UID: "8e682da4-be9e-4318-8f65-cd879f9a826a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.652468 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.652504 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv5fc\" (UniqueName: \"kubernetes.io/projected/8e682da4-be9e-4318-8f65-cd879f9a826a-kube-api-access-tv5fc\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:15 crc kubenswrapper[4772]: I0127 15:23:15.652515 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e682da4-be9e-4318-8f65-cd879f9a826a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.043437 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerID="370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb" exitCode=0 Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.043489 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wbmd9" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.043487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerDied","Data":"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb"} Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.043598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wbmd9" event={"ID":"8e682da4-be9e-4318-8f65-cd879f9a826a","Type":"ContainerDied","Data":"ae5e43e320906455ee157d2ed44f17f1214d92e37e6149b34c7998975892c8dd"} Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.043620 4772 scope.go:117] "RemoveContainer" containerID="370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.058815 4772 scope.go:117] "RemoveContainer" containerID="bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.078571 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.082987 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wbmd9"] Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.104503 4772 scope.go:117] "RemoveContainer" containerID="2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.125907 4772 scope.go:117] "RemoveContainer" containerID="370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb" Jan 27 15:23:16 crc kubenswrapper[4772]: E0127 15:23:16.126345 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb\": container with ID starting with 370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb not found: ID does not exist" containerID="370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.126387 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb"} err="failed to get container status \"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb\": rpc error: code = NotFound desc = could not find container \"370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb\": container with ID starting with 370eac75d4ce32b16b99194d5e4a9c1cd98d3301cc51e5eca10f39a5246044cb not found: ID does not exist" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.126415 4772 scope.go:117] "RemoveContainer" containerID="bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b" Jan 27 15:23:16 crc kubenswrapper[4772]: E0127 15:23:16.126795 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b\": container with ID starting with bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b not found: ID does not exist" containerID="bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.126815 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b"} err="failed to get container status \"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b\": rpc error: code = NotFound desc = could not find container \"bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b\": container with ID 
starting with bee09a387c7651d30762718fdf9481148ff7173f222fd6351786d33b9f772e6b not found: ID does not exist" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.126827 4772 scope.go:117] "RemoveContainer" containerID="2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643" Jan 27 15:23:16 crc kubenswrapper[4772]: E0127 15:23:16.128509 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643\": container with ID starting with 2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643 not found: ID does not exist" containerID="2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.128546 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643"} err="failed to get container status \"2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643\": rpc error: code = NotFound desc = could not find container \"2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643\": container with ID starting with 2f8832235d7bf2dd65342391017e414aa8e6b691bf56ffa84bb5000edb279643 not found: ID does not exist" Jan 27 15:23:16 crc kubenswrapper[4772]: I0127 15:23:16.670229 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" path="/var/lib/kubelet/pods/8e682da4-be9e-4318-8f65-cd879f9a826a/volumes" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.710848 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9"] Jan 27 15:23:21 crc kubenswrapper[4772]: E0127 15:23:21.711752 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="extract-content" 
Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.711774 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="extract-content" Jan 27 15:23:21 crc kubenswrapper[4772]: E0127 15:23:21.711812 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="extract-utilities" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.711823 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="extract-utilities" Jan 27 15:23:21 crc kubenswrapper[4772]: E0127 15:23:21.711848 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="registry-server" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.711856 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="registry-server" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.712005 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e682da4-be9e-4318-8f65-cd879f9a826a" containerName="registry-server" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.712580 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.716770 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lmxhh" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.718093 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.719125 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.721441 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qm6dd" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.725794 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.734294 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.735004 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.737084 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7pnmk" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.746714 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.747601 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.760419 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xtchh" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.767351 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.767396 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.777679 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.843207 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.844133 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.849411 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ffgpg" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.872136 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.884081 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xv2c\" (UniqueName: \"kubernetes.io/projected/d395f105-54f0-4497-a119-57802be313a3-kube-api-access-4xv2c\") pod \"designate-operator-controller-manager-77554cdc5c-tkr6j\" (UID: \"d395f105-54f0-4497-a119-57802be313a3\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.884124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7hp\" (UniqueName: \"kubernetes.io/projected/4c63a702-50b9-42f3-858e-7e27da0a8d8f-kube-api-access-9b7hp\") pod \"heat-operator-controller-manager-575ffb885b-mtd9d\" (UID: \"4c63a702-50b9-42f3-858e-7e27da0a8d8f\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.884145 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjkj\" (UniqueName: \"kubernetes.io/projected/fb300814-fca7-4419-ac6e-c08b33edd4be-kube-api-access-snjkj\") pod \"glance-operator-controller-manager-67dd55ff59-hgscb\" (UID: \"fb300814-fca7-4419-ac6e-c08b33edd4be\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.884203 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brwq5\" (UniqueName: \"kubernetes.io/projected/674f4da6-f50d-4bab-808d-56ab3b9e2cb4-kube-api-access-brwq5\") pod \"barbican-operator-controller-manager-65ff799cfd-t42n9\" (UID: \"674f4da6-f50d-4bab-808d-56ab3b9e2cb4\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.884229 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4fn8\" (UniqueName: \"kubernetes.io/projected/fde95124-892b-411a-ba05-fa70927c8838-kube-api-access-z4fn8\") pod \"cinder-operator-controller-manager-655bf9cfbb-cgh7j\" (UID: \"fde95124-892b-411a-ba05-fa70927c8838\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.897232 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.898025 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.908833 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-97xd2" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.943302 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.948240 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.949047 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.954526 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-5ldvt" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.958226 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr"] Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.959032 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.960806 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.961288 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-kdbs8" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.984913 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4fn8\" (UniqueName: \"kubernetes.io/projected/fde95124-892b-411a-ba05-fa70927c8838-kube-api-access-z4fn8\") pod \"cinder-operator-controller-manager-655bf9cfbb-cgh7j\" (UID: \"fde95124-892b-411a-ba05-fa70927c8838\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.985008 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xv2c\" (UniqueName: \"kubernetes.io/projected/d395f105-54f0-4497-a119-57802be313a3-kube-api-access-4xv2c\") pod \"designate-operator-controller-manager-77554cdc5c-tkr6j\" (UID: \"d395f105-54f0-4497-a119-57802be313a3\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.985042 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7hp\" (UniqueName: \"kubernetes.io/projected/4c63a702-50b9-42f3-858e-7e27da0a8d8f-kube-api-access-9b7hp\") pod \"heat-operator-controller-manager-575ffb885b-mtd9d\" (UID: \"4c63a702-50b9-42f3-858e-7e27da0a8d8f\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.985068 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-snjkj\" (UniqueName: \"kubernetes.io/projected/fb300814-fca7-4419-ac6e-c08b33edd4be-kube-api-access-snjkj\") pod \"glance-operator-controller-manager-67dd55ff59-hgscb\" (UID: \"fb300814-fca7-4419-ac6e-c08b33edd4be\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.985125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brwq5\" (UniqueName: \"kubernetes.io/projected/674f4da6-f50d-4bab-808d-56ab3b9e2cb4-kube-api-access-brwq5\") pod \"barbican-operator-controller-manager-65ff799cfd-t42n9\" (UID: \"674f4da6-f50d-4bab-808d-56ab3b9e2cb4\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:21 crc kubenswrapper[4772]: I0127 15:23:21.986600 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.003236 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.016061 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4fn8\" (UniqueName: \"kubernetes.io/projected/fde95124-892b-411a-ba05-fa70927c8838-kube-api-access-z4fn8\") pod \"cinder-operator-controller-manager-655bf9cfbb-cgh7j\" (UID: \"fde95124-892b-411a-ba05-fa70927c8838\") " pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.024401 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snjkj\" (UniqueName: \"kubernetes.io/projected/fb300814-fca7-4419-ac6e-c08b33edd4be-kube-api-access-snjkj\") pod \"glance-operator-controller-manager-67dd55ff59-hgscb\" (UID: 
\"fb300814-fca7-4419-ac6e-c08b33edd4be\") " pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.029461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xv2c\" (UniqueName: \"kubernetes.io/projected/d395f105-54f0-4497-a119-57802be313a3-kube-api-access-4xv2c\") pod \"designate-operator-controller-manager-77554cdc5c-tkr6j\" (UID: \"d395f105-54f0-4497-a119-57802be313a3\") " pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.039777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b7hp\" (UniqueName: \"kubernetes.io/projected/4c63a702-50b9-42f3-858e-7e27da0a8d8f-kube-api-access-9b7hp\") pod \"heat-operator-controller-manager-575ffb885b-mtd9d\" (UID: \"4c63a702-50b9-42f3-858e-7e27da0a8d8f\") " pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.046868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brwq5\" (UniqueName: \"kubernetes.io/projected/674f4da6-f50d-4bab-808d-56ab3b9e2cb4-kube-api-access-brwq5\") pod \"barbican-operator-controller-manager-65ff799cfd-t42n9\" (UID: \"674f4da6-f50d-4bab-808d-56ab3b9e2cb4\") " pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.075231 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.076026 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.079463 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4mmkm" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.085120 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.085911 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088187 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088526 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qbl\" (UniqueName: \"kubernetes.io/projected/27ec5082-c170-465b-b3a3-1f27a545fd71-kube-api-access-x2qbl\") pod \"manila-operator-controller-manager-849fcfbb6b-tvrx9\" (UID: \"27ec5082-c170-465b-b3a3-1f27a545fd71\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnt7\" (UniqueName: \"kubernetes.io/projected/783d8159-e67a-4796-83d8-4eff27d79505-kube-api-access-dnnt7\") pod \"keystone-operator-controller-manager-55f684fd56-wzjrz\" (UID: \"783d8159-e67a-4796-83d8-4eff27d79505\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088626 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmq4\" (UniqueName: \"kubernetes.io/projected/e85aef3a-e235-473c-94cc-1f6237798b3e-kube-api-access-fsmq4\") pod \"horizon-operator-controller-manager-77d5c5b54f-jcb4p\" (UID: \"e85aef3a-e235-473c-94cc-1f6237798b3e\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088643 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74pn\" (UniqueName: \"kubernetes.io/projected/e7465bd0-3b6e-4199-9ee6-28b512198847-kube-api-access-z74pn\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.088689 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs77k\" (UniqueName: \"kubernetes.io/projected/2df85221-33ed-49be-949c-516810279e4d-kube-api-access-zs77k\") pod \"ironic-operator-controller-manager-768b776ffb-sxbjn\" (UID: \"2df85221-33ed-49be-949c-516810279e4d\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.101310 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9"] Jan 27 15:23:22 crc 
kubenswrapper[4772]: I0127 15:23:22.101423 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hqfvl" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.116954 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.126346 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.131734 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.151011 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.186769 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.191602 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.212574 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.221091 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qbl\" (UniqueName: \"kubernetes.io/projected/27ec5082-c170-465b-b3a3-1f27a545fd71-kube-api-access-x2qbl\") pod \"manila-operator-controller-manager-849fcfbb6b-tvrx9\" (UID: \"27ec5082-c170-465b-b3a3-1f27a545fd71\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.224057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnt7\" (UniqueName: \"kubernetes.io/projected/783d8159-e67a-4796-83d8-4eff27d79505-kube-api-access-dnnt7\") pod \"keystone-operator-controller-manager-55f684fd56-wzjrz\" (UID: \"783d8159-e67a-4796-83d8-4eff27d79505\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.224150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmq4\" (UniqueName: \"kubernetes.io/projected/e85aef3a-e235-473c-94cc-1f6237798b3e-kube-api-access-fsmq4\") pod \"horizon-operator-controller-manager-77d5c5b54f-jcb4p\" (UID: \"e85aef3a-e235-473c-94cc-1f6237798b3e\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.224206 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74pn\" (UniqueName: \"kubernetes.io/projected/e7465bd0-3b6e-4199-9ee6-28b512198847-kube-api-access-z74pn\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 
15:23:22.224241 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.224283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs77k\" (UniqueName: \"kubernetes.io/projected/2df85221-33ed-49be-949c-516810279e4d-kube-api-access-zs77k\") pod \"ironic-operator-controller-manager-768b776ffb-sxbjn\" (UID: \"2df85221-33ed-49be-949c-516810279e4d\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.225470 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.225518 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:22.725502245 +0000 UTC m=+988.706111343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.228619 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qxqv9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.242263 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.250810 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.261875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qbl\" (UniqueName: \"kubernetes.io/projected/27ec5082-c170-465b-b3a3-1f27a545fd71-kube-api-access-x2qbl\") pod \"manila-operator-controller-manager-849fcfbb6b-tvrx9\" (UID: \"27ec5082-c170-465b-b3a3-1f27a545fd71\") " pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.282487 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8wz4d" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.283401 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmq4\" (UniqueName: \"kubernetes.io/projected/e85aef3a-e235-473c-94cc-1f6237798b3e-kube-api-access-fsmq4\") pod \"horizon-operator-controller-manager-77d5c5b54f-jcb4p\" (UID: \"e85aef3a-e235-473c-94cc-1f6237798b3e\") " 
pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.287971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs77k\" (UniqueName: \"kubernetes.io/projected/2df85221-33ed-49be-949c-516810279e4d-kube-api-access-zs77k\") pod \"ironic-operator-controller-manager-768b776ffb-sxbjn\" (UID: \"2df85221-33ed-49be-949c-516810279e4d\") " pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.298117 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74pn\" (UniqueName: \"kubernetes.io/projected/e7465bd0-3b6e-4199-9ee6-28b512198847-kube-api-access-z74pn\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.298708 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.299641 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.302228 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnt7\" (UniqueName: \"kubernetes.io/projected/783d8159-e67a-4796-83d8-4eff27d79505-kube-api-access-dnnt7\") pod \"keystone-operator-controller-manager-55f684fd56-wzjrz\" (UID: \"783d8159-e67a-4796-83d8-4eff27d79505\") " pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.313457 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.316726 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kh8qn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.324842 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.329654 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kqwk\" (UniqueName: \"kubernetes.io/projected/0a88aa66-b634-44ee-8e5b-bfeacb765e57-kube-api-access-6kqwk\") pod \"neutron-operator-controller-manager-7ffd8d76d4-gcpj4\" (UID: \"0a88aa66-b634-44ee-8e5b-bfeacb765e57\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.343218 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.344069 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.345962 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rlp2h" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.369900 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.395350 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.410714 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.430939 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.433448 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zmgp\" (UniqueName: \"kubernetes.io/projected/80584c24-3c75-4624-802f-e608f640eeaa-kube-api-access-2zmgp\") pod \"octavia-operator-controller-manager-7875d7675-ktfbt\" (UID: \"80584c24-3c75-4624-802f-e608f640eeaa\") " pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.433489 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbb4j\" (UniqueName: \"kubernetes.io/projected/e7fc5297-101a-496e-a7c6-e7296e08a5af-kube-api-access-gbb4j\") pod \"nova-operator-controller-manager-ddcbfd695-6wltn\" (UID: \"e7fc5297-101a-496e-a7c6-e7296e08a5af\") " 
pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.433524 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kqwk\" (UniqueName: \"kubernetes.io/projected/0a88aa66-b634-44ee-8e5b-bfeacb765e57-kube-api-access-6kqwk\") pod \"neutron-operator-controller-manager-7ffd8d76d4-gcpj4\" (UID: \"0a88aa66-b634-44ee-8e5b-bfeacb765e57\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.433570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8n6m\" (UniqueName: \"kubernetes.io/projected/b73c175a-e89e-434f-996a-65c1140bb8dd-kube-api-access-t8n6m\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zhd82\" (UID: \"b73c175a-e89e-434f-996a-65c1140bb8dd\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.440305 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.441138 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.443302 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.443588 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tbw52" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.452701 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.454825 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.470925 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ljmlt" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.484010 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.496706 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.497577 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.504480 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kqwk\" (UniqueName: \"kubernetes.io/projected/0a88aa66-b634-44ee-8e5b-bfeacb765e57-kube-api-access-6kqwk\") pod \"neutron-operator-controller-manager-7ffd8d76d4-gcpj4\" (UID: \"0a88aa66-b634-44ee-8e5b-bfeacb765e57\") " pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.513718 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hf92f" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.518939 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.520796 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.521544 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.528658 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-64w92" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.538372 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.541373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zmgp\" (UniqueName: \"kubernetes.io/projected/80584c24-3c75-4624-802f-e608f640eeaa-kube-api-access-2zmgp\") pod \"octavia-operator-controller-manager-7875d7675-ktfbt\" (UID: \"80584c24-3c75-4624-802f-e608f640eeaa\") " pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.541411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmnhg\" (UniqueName: \"kubernetes.io/projected/1389813b-42ea-433f-820c-e5b8b41713d7-kube-api-access-gmnhg\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.541443 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzpp\" (UniqueName: 
\"kubernetes.io/projected/e76712a7-ebf6-4f04-a52c-c8d2bacb87f7-kube-api-access-lmzpp\") pod \"ovn-operator-controller-manager-6f75f45d54-ww79v\" (UID: \"e76712a7-ebf6-4f04-a52c-c8d2bacb87f7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.541471 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbb4j\" (UniqueName: \"kubernetes.io/projected/e7fc5297-101a-496e-a7c6-e7296e08a5af-kube-api-access-gbb4j\") pod \"nova-operator-controller-manager-ddcbfd695-6wltn\" (UID: \"e7fc5297-101a-496e-a7c6-e7296e08a5af\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.541569 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8n6m\" (UniqueName: \"kubernetes.io/projected/b73c175a-e89e-434f-996a-65c1140bb8dd-kube-api-access-t8n6m\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zhd82\" (UID: \"b73c175a-e89e-434f-996a-65c1140bb8dd\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.564899 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.564951 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.569721 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.577507 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf"] Jan 27 15:23:22 
crc kubenswrapper[4772]: I0127 15:23:22.578301 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.580340 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8n6m\" (UniqueName: \"kubernetes.io/projected/b73c175a-e89e-434f-996a-65c1140bb8dd-kube-api-access-t8n6m\") pod \"mariadb-operator-controller-manager-6b9fb5fdcb-zhd82\" (UID: \"b73c175a-e89e-434f-996a-65c1140bb8dd\") " pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.580616 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.580613 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-q2hmw" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.599305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbb4j\" (UniqueName: \"kubernetes.io/projected/e7fc5297-101a-496e-a7c6-e7296e08a5af-kube-api-access-gbb4j\") pod \"nova-operator-controller-manager-ddcbfd695-6wltn\" (UID: \"e7fc5297-101a-496e-a7c6-e7296e08a5af\") " pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.599520 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zmgp\" (UniqueName: \"kubernetes.io/projected/80584c24-3c75-4624-802f-e608f640eeaa-kube-api-access-2zmgp\") pod \"octavia-operator-controller-manager-7875d7675-ktfbt\" (UID: \"80584c24-3c75-4624-802f-e608f640eeaa\") " pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:22 crc 
kubenswrapper[4772]: I0127 15:23:22.599585 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.600493 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.602078 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-6hhp6" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.612505 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.618253 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.627897 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.629049 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.633272 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-2w5kx" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.642834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmnhg\" (UniqueName: \"kubernetes.io/projected/1389813b-42ea-433f-820c-e5b8b41713d7-kube-api-access-gmnhg\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.642871 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzpp\" (UniqueName: \"kubernetes.io/projected/e76712a7-ebf6-4f04-a52c-c8d2bacb87f7-kube-api-access-lmzpp\") pod \"ovn-operator-controller-manager-6f75f45d54-ww79v\" (UID: \"e76712a7-ebf6-4f04-a52c-c8d2bacb87f7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.642914 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k4pg\" (UniqueName: \"kubernetes.io/projected/c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8-kube-api-access-6k4pg\") pod \"placement-operator-controller-manager-79d5ccc684-vwnwk\" (UID: \"c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.642943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qgw\" (UniqueName: 
\"kubernetes.io/projected/c09741c3-6bae-487a-9b4c-7c9f01d8c5bf-kube-api-access-l7qgw\") pod \"swift-operator-controller-manager-547cbdb99f-l8d48\" (UID: \"c09741c3-6bae-487a-9b4c-7c9f01d8c5bf\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.642978 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.643079 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.643115 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:23.143102743 +0000 UTC m=+989.123711841 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.647907 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.655429 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.656275 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.657128 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.658452 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.659015 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xvq6z" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.663665 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.664374 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzpp\" (UniqueName: \"kubernetes.io/projected/e76712a7-ebf6-4f04-a52c-c8d2bacb87f7-kube-api-access-lmzpp\") pod \"ovn-operator-controller-manager-6f75f45d54-ww79v\" (UID: \"e76712a7-ebf6-4f04-a52c-c8d2bacb87f7\") " pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.664477 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmnhg\" (UniqueName: \"kubernetes.io/projected/1389813b-42ea-433f-820c-e5b8b41713d7-kube-api-access-gmnhg\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.701810 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.713468 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.732583 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.741389 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.753034 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sn68d" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.761889 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxgx\" (UniqueName: \"kubernetes.io/projected/e4a99865-64a7-49e5-bdce-ff929105fc0d-kube-api-access-lrxgx\") pod \"telemetry-operator-controller-manager-799bc87c89-k2l8k\" (UID: \"e4a99865-64a7-49e5-bdce-ff929105fc0d\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.767548 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.770310 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.770925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8vnj\" (UniqueName: \"kubernetes.io/projected/6242683c-24ad-4e22-a7b3-8463e07388c2-kube-api-access-s8vnj\") pod \"test-operator-controller-manager-69797bbcbd-ln7xf\" (UID: \"6242683c-24ad-4e22-a7b3-8463e07388c2\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771090 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297"] Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k4pg\" (UniqueName: \"kubernetes.io/projected/c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8-kube-api-access-6k4pg\") pod \"placement-operator-controller-manager-79d5ccc684-vwnwk\" (UID: \"c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771574 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qgw\" (UniqueName: \"kubernetes.io/projected/c09741c3-6bae-487a-9b4c-7c9f01d8c5bf-kube-api-access-l7qgw\") pod 
\"swift-operator-controller-manager-547cbdb99f-l8d48\" (UID: \"c09741c3-6bae-487a-9b4c-7c9f01d8c5bf\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771631 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcm8k\" (UniqueName: \"kubernetes.io/projected/783285f4-2e9d-4af5-b017-32676e7d1b01-kube-api-access-jcm8k\") pod \"watcher-operator-controller-manager-6c9bb4b66c-ws2mh\" (UID: \"783285f4-2e9d-4af5-b017-32676e7d1b01\") " pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771800 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.771856 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phdt\" (UniqueName: \"kubernetes.io/projected/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-kube-api-access-7phdt\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " 
pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.772352 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.772424 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:23.772390461 +0000 UTC m=+989.752999559 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.805785 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k4pg\" (UniqueName: \"kubernetes.io/projected/c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8-kube-api-access-6k4pg\") pod \"placement-operator-controller-manager-79d5ccc684-vwnwk\" (UID: \"c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8\") " pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.806940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qgw\" (UniqueName: \"kubernetes.io/projected/c09741c3-6bae-487a-9b4c-7c9f01d8c5bf-kube-api-access-l7qgw\") pod \"swift-operator-controller-manager-547cbdb99f-l8d48\" (UID: \"c09741c3-6bae-487a-9b4c-7c9f01d8c5bf\") " pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.815866 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.828393 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9"] Jan 27 15:23:22 crc kubenswrapper[4772]: W0127 15:23:22.852522 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod674f4da6_f50d_4bab_808d_56ab3b9e2cb4.slice/crio-327b15a8084a569369ed015449a1ec6b61f653b1f47c81d392243142feb80680 WatchSource:0}: Error finding container 327b15a8084a569369ed015449a1ec6b61f653b1f47c81d392243142feb80680: Status 404 returned error can't find the container with id 327b15a8084a569369ed015449a1ec6b61f653b1f47c81d392243142feb80680 Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874361 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8vnj\" (UniqueName: \"kubernetes.io/projected/6242683c-24ad-4e22-a7b3-8463e07388c2-kube-api-access-s8vnj\") pod \"test-operator-controller-manager-69797bbcbd-ln7xf\" (UID: \"6242683c-24ad-4e22-a7b3-8463e07388c2\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874443 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t57ck\" (UniqueName: 
\"kubernetes.io/projected/abaf1142-1b7c-4987-8a9d-c91e6456c4a5-kube-api-access-t57ck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h9297\" (UID: \"abaf1142-1b7c-4987-8a9d-c91e6456c4a5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874524 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcm8k\" (UniqueName: \"kubernetes.io/projected/783285f4-2e9d-4af5-b017-32676e7d1b01-kube-api-access-jcm8k\") pod \"watcher-operator-controller-manager-6c9bb4b66c-ws2mh\" (UID: \"783285f4-2e9d-4af5-b017-32676e7d1b01\") " pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phdt\" (UniqueName: \"kubernetes.io/projected/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-kube-api-access-7phdt\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874585 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxgx\" (UniqueName: \"kubernetes.io/projected/e4a99865-64a7-49e5-bdce-ff929105fc0d-kube-api-access-lrxgx\") pod \"telemetry-operator-controller-manager-799bc87c89-k2l8k\" (UID: \"e4a99865-64a7-49e5-bdce-ff929105fc0d\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.874619 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: 
\"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.874764 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.874813 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:23.374797836 +0000 UTC m=+989.355406934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.875197 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: E0127 15:23:22.875221 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:23.375214388 +0000 UTC m=+989.355823486 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.905106 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcm8k\" (UniqueName: \"kubernetes.io/projected/783285f4-2e9d-4af5-b017-32676e7d1b01-kube-api-access-jcm8k\") pod \"watcher-operator-controller-manager-6c9bb4b66c-ws2mh\" (UID: \"783285f4-2e9d-4af5-b017-32676e7d1b01\") " pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.907684 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxgx\" (UniqueName: \"kubernetes.io/projected/e4a99865-64a7-49e5-bdce-ff929105fc0d-kube-api-access-lrxgx\") pod \"telemetry-operator-controller-manager-799bc87c89-k2l8k\" (UID: \"e4a99865-64a7-49e5-bdce-ff929105fc0d\") " pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.910978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phdt\" (UniqueName: \"kubernetes.io/projected/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-kube-api-access-7phdt\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.915543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8vnj\" (UniqueName: \"kubernetes.io/projected/6242683c-24ad-4e22-a7b3-8463e07388c2-kube-api-access-s8vnj\") pod \"test-operator-controller-manager-69797bbcbd-ln7xf\" (UID: 
\"6242683c-24ad-4e22-a7b3-8463e07388c2\") " pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.959273 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.976032 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t57ck\" (UniqueName: \"kubernetes.io/projected/abaf1142-1b7c-4987-8a9d-c91e6456c4a5-kube-api-access-t57ck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h9297\" (UID: \"abaf1142-1b7c-4987-8a9d-c91e6456c4a5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.985798 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:22 crc kubenswrapper[4772]: I0127 15:23:22.996767 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t57ck\" (UniqueName: \"kubernetes.io/projected/abaf1142-1b7c-4987-8a9d-c91e6456c4a5-kube-api-access-t57ck\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h9297\" (UID: \"abaf1142-1b7c-4987-8a9d-c91e6456c4a5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.021519 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.038092 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.107205 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.122796 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" event={"ID":"674f4da6-f50d-4bab-808d-56ab3b9e2cb4","Type":"ContainerStarted","Data":"327b15a8084a569369ed015449a1ec6b61f653b1f47c81d392243142feb80680"} Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.161265 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.181035 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.181210 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.181258 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:24.181242117 +0000 UTC m=+990.161851215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.224111 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.236310 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.254861 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.265938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.315940 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd395f105_54f0_4497_a119_57802be313a3.slice/crio-173210d17fd463ed8fef2550b0ce546bd294ef92d7429c1055eeb446216d0cb0 WatchSource:0}: Error finding container 173210d17fd463ed8fef2550b0ce546bd294ef92d7429c1055eeb446216d0cb0: Status 404 returned error can't find the container with id 173210d17fd463ed8fef2550b0ce546bd294ef92d7429c1055eeb446216d0cb0 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.384714 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: 
\"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.384859 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.385014 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.385071 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:24.385054097 +0000 UTC m=+990.365663195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.385135 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.385161 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:24.38515289 +0000 UTC m=+990.365761998 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.404100 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.423926 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df85221_33ed_49be_949c_516810279e4d.slice/crio-923cea042cc049643b9f18ac2932392946acb25bc60a1395d9554055be88e0fd WatchSource:0}: Error finding container 923cea042cc049643b9f18ac2932392946acb25bc60a1395d9554055be88e0fd: Status 404 returned error can't find the container with id 923cea042cc049643b9f18ac2932392946acb25bc60a1395d9554055be88e0fd Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.441654 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.457367 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.469112 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.493985 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.501104 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode85aef3a_e235_473c_94cc_1f6237798b3e.slice/crio-56bf3334bca8a317a9c02e89bea12c6ad7b7a61a941db3290a4791db7563a506 WatchSource:0}: Error finding container 56bf3334bca8a317a9c02e89bea12c6ad7b7a61a941db3290a4791db7563a506: Status 404 returned error can't find the container with id 56bf3334bca8a317a9c02e89bea12c6ad7b7a61a941db3290a4791db7563a506 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.592028 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.606714 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.626464 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.630990 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a88aa66_b634_44ee_8e5b_bfeacb765e57.slice/crio-4bb52078231b82897e96da9e073eb926d06bf3f15389686c1533aa0597ef25b0 WatchSource:0}: Error finding container 4bb52078231b82897e96da9e073eb926d06bf3f15389686c1533aa0597ef25b0: Status 404 returned error can't find the container with id 4bb52078231b82897e96da9e073eb926d06bf3f15389686c1533aa0597ef25b0 Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.631594 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80584c24_3c75_4624_802f_e608f640eeaa.slice/crio-75a5b9256fe3030df211e53616d6aaacfa661c5623aee3562a86d05a5b0c0ee1 WatchSource:0}: Error finding container 75a5b9256fe3030df211e53616d6aaacfa661c5623aee3562a86d05a5b0c0ee1: Status 404 returned error can't find the 
container with id 75a5b9256fe3030df211e53616d6aaacfa661c5623aee3562a86d05a5b0c0ee1 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.635547 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.641345 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7fc5297_101a_496e_a7c6_e7296e08a5af.slice/crio-f223d3b2deea88b7b47d9c6cfd2561cd263d36b3f2656a548273fd94975efdad WatchSource:0}: Error finding container f223d3b2deea88b7b47d9c6cfd2561cd263d36b3f2656a548273fd94975efdad: Status 404 returned error can't find the container with id f223d3b2deea88b7b47d9c6cfd2561cd263d36b3f2656a548273fd94975efdad Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.641681 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb73c175a_e89e_434f_996a_65c1140bb8dd.slice/crio-217e47d1ea1e577f9f3d26d127deeb429193e347283ee4598ecaa45a8a6c4104 WatchSource:0}: Error finding container 217e47d1ea1e577f9f3d26d127deeb429193e347283ee4598ecaa45a8a6c4104: Status 404 returned error can't find the container with id 217e47d1ea1e577f9f3d26d127deeb429193e347283ee4598ecaa45a8a6c4104 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.743379 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk"] Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.754190 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6k4pg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-79d5ccc684-vwnwk_openstack-operators(c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.755398 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" podUID="c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8" Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.755865 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode76712a7_ebf6_4f04_a52c_c8d2bacb87f7.slice/crio-f752e4541b769e913ac3b0d8af7e559c30963fa17cfa0958a239f359bea1fb22 WatchSource:0}: Error finding container f752e4541b769e913ac3b0d8af7e559c30963fa17cfa0958a239f359bea1fb22: Status 404 returned error can't find the container with id f752e4541b769e913ac3b0d8af7e559c30963fa17cfa0958a239f359bea1fb22 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.758261 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48"] Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.761536 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:fa46fc14710961e6b4a76a3522dca3aa3cfa71436c7cf7ade533d3712822f327,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lmzpp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6f75f45d54-ww79v_openstack-operators(e76712a7-ebf6-4f04-a52c-c8d2bacb87f7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.762672 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" podUID="e76712a7-ebf6-4f04-a52c-c8d2bacb87f7" Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.762949 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc09741c3_6bae_487a_9b4c_7c9f01d8c5bf.slice/crio-ccbef40d188785ee8381011d390820b0ef2381f4740dc83e4e0fcb6544f9a8fb WatchSource:0}: Error finding container ccbef40d188785ee8381011d390820b0ef2381f4740dc83e4e0fcb6544f9a8fb: Status 404 returned error can't find the container with id ccbef40d188785ee8381011d390820b0ef2381f4740dc83e4e0fcb6544f9a8fb Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.763248 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v"] Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.768527 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l7qgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-547cbdb99f-l8d48_openstack-operators(c09741c3-6bae-487a-9b4c-7c9f01d8c5bf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.769727 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" podUID="c09741c3-6bae-487a-9b4c-7c9f01d8c5bf" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.791578 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.791812 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 
15:23:23.791873 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:25.791854795 +0000 UTC m=+991.772463893 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.813853 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.825524 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783285f4_2e9d_4af5_b017_32676e7d1b01.slice/crio-ed6acc5f50bf0655622492cac0b3f0b4b220be2099e1a5d911dcf85528df3745 WatchSource:0}: Error finding container ed6acc5f50bf0655622492cac0b3f0b4b220be2099e1a5d911dcf85528df3745: Status 404 returned error can't find the container with id ed6acc5f50bf0655622492cac0b3f0b4b220be2099e1a5d911dcf85528df3745 Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.828188 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/watcher-operator@sha256:162fb83ed76cbf5d44ba057fbeee02a9182fdf02346afadb3e16b2e3627e1940,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jcm8k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c9bb4b66c-ws2mh_openstack-operators(783285f4-2e9d-4af5-b017-32676e7d1b01): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.830318 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" podUID="783285f4-2e9d-4af5-b017-32676e7d1b01" Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.831460 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.834946 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabaf1142_1b7c_4987_8a9d_c91e6456c4a5.slice/crio-5d0e7b142093081addf02d817aa0ec9a28effd35fcb9f32597195a91e4afc707 WatchSource:0}: Error finding container 5d0e7b142093081addf02d817aa0ec9a28effd35fcb9f32597195a91e4afc707: Status 404 returned error can't find the container with id 
5d0e7b142093081addf02d817aa0ec9a28effd35fcb9f32597195a91e4afc707 Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.851193 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf"] Jan 27 15:23:23 crc kubenswrapper[4772]: I0127 15:23:23.859556 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k"] Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.865332 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6242683c_24ad_4e22_a7b3_8463e07388c2.slice/crio-f2ece18acf3e1a9e84f9affb02aa64941da9edaa5466989d492565763b3e141f WatchSource:0}: Error finding container f2ece18acf3e1a9e84f9affb02aa64941da9edaa5466989d492565763b3e141f: Status 404 returned error can't find the container with id f2ece18acf3e1a9e84f9affb02aa64941da9edaa5466989d492565763b3e141f Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.867570 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s8vnj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-69797bbcbd-ln7xf_openstack-operators(6242683c-24ad-4e22-a7b3-8463e07388c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: W0127 15:23:23.868378 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a99865_64a7_49e5_bdce_ff929105fc0d.slice/crio-d0b20d02410e5c548ba2aadd079cbe4efef894fa30ffeedf5ad011a05a057ca7 WatchSource:0}: Error finding container 
d0b20d02410e5c548ba2aadd079cbe4efef894fa30ffeedf5ad011a05a057ca7: Status 404 returned error can't find the container with id d0b20d02410e5c548ba2aadd079cbe4efef894fa30ffeedf5ad011a05a057ca7 Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.869087 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" podUID="6242683c-24ad-4e22-a7b3-8463e07388c2" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.870654 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lrxgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-799bc87c89-k2l8k_openstack-operators(e4a99865-64a7-49e5-bdce-ff929105fc0d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 15:23:23 crc kubenswrapper[4772]: E0127 15:23:23.872121 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" podUID="e4a99865-64a7-49e5-bdce-ff929105fc0d" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.128330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" event={"ID":"e4a99865-64a7-49e5-bdce-ff929105fc0d","Type":"ContainerStarted","Data":"d0b20d02410e5c548ba2aadd079cbe4efef894fa30ffeedf5ad011a05a057ca7"} Jan 27 15:23:24 crc 
kubenswrapper[4772]: E0127 15:23:24.129836 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" podUID="e4a99865-64a7-49e5-bdce-ff929105fc0d" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.130433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" event={"ID":"27ec5082-c170-465b-b3a3-1f27a545fd71","Type":"ContainerStarted","Data":"2f5dcda5d379d1885b362cc12de1a01612d50caf320579c3c8a372c1ff596cec"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.131097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" event={"ID":"d395f105-54f0-4497-a119-57802be313a3","Type":"ContainerStarted","Data":"173210d17fd463ed8fef2550b0ce546bd294ef92d7429c1055eeb446216d0cb0"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.132934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" event={"ID":"e85aef3a-e235-473c-94cc-1f6237798b3e","Type":"ContainerStarted","Data":"56bf3334bca8a317a9c02e89bea12c6ad7b7a61a941db3290a4791db7563a506"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.133712 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" event={"ID":"e76712a7-ebf6-4f04-a52c-c8d2bacb87f7","Type":"ContainerStarted","Data":"f752e4541b769e913ac3b0d8af7e559c30963fa17cfa0958a239f359bea1fb22"} Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.134632 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:fa46fc14710961e6b4a76a3522dca3aa3cfa71436c7cf7ade533d3712822f327\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" podUID="e76712a7-ebf6-4f04-a52c-c8d2bacb87f7" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.134963 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" event={"ID":"783285f4-2e9d-4af5-b017-32676e7d1b01","Type":"ContainerStarted","Data":"ed6acc5f50bf0655622492cac0b3f0b4b220be2099e1a5d911dcf85528df3745"} Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.135665 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/watcher-operator@sha256:162fb83ed76cbf5d44ba057fbeee02a9182fdf02346afadb3e16b2e3627e1940\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" podUID="783285f4-2e9d-4af5-b017-32676e7d1b01" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.135959 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" event={"ID":"fde95124-892b-411a-ba05-fa70927c8838","Type":"ContainerStarted","Data":"779c536711b064a2c67dc5eaa590675460d3c315e4c4be0ba5fe5fb942f89869"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.136590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" event={"ID":"e7fc5297-101a-496e-a7c6-e7296e08a5af","Type":"ContainerStarted","Data":"f223d3b2deea88b7b47d9c6cfd2561cd263d36b3f2656a548273fd94975efdad"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.137493 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" event={"ID":"b73c175a-e89e-434f-996a-65c1140bb8dd","Type":"ContainerStarted","Data":"217e47d1ea1e577f9f3d26d127deeb429193e347283ee4598ecaa45a8a6c4104"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.138130 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" event={"ID":"2df85221-33ed-49be-949c-516810279e4d","Type":"ContainerStarted","Data":"923cea042cc049643b9f18ac2932392946acb25bc60a1395d9554055be88e0fd"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.139149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" event={"ID":"4c63a702-50b9-42f3-858e-7e27da0a8d8f","Type":"ContainerStarted","Data":"d39c45eba42b53622727a2983e3648d45ebd67d17e99c62446159a41354d110a"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.142503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" event={"ID":"abaf1142-1b7c-4987-8a9d-c91e6456c4a5","Type":"ContainerStarted","Data":"5d0e7b142093081addf02d817aa0ec9a28effd35fcb9f32597195a91e4afc707"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.144382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" event={"ID":"c09741c3-6bae-487a-9b4c-7c9f01d8c5bf","Type":"ContainerStarted","Data":"ccbef40d188785ee8381011d390820b0ef2381f4740dc83e4e0fcb6544f9a8fb"} Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.146224 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" podUID="c09741c3-6bae-487a-9b4c-7c9f01d8c5bf" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.148545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" event={"ID":"783d8159-e67a-4796-83d8-4eff27d79505","Type":"ContainerStarted","Data":"f39c14e2a877fc5caf6155f90f1f56d6b7d577ac42e65ad3f28efe83d3254529"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.149635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" event={"ID":"c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8","Type":"ContainerStarted","Data":"2e3eb37d74672104ba96a984370bae529b30991e71efba38da9c8804708dfe62"} Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.152540 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" podUID="c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.161529 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" event={"ID":"fb300814-fca7-4419-ac6e-c08b33edd4be","Type":"ContainerStarted","Data":"b1121b1aaf2c2d1be47c65ced15bfe248060e0013f876ad5bec34c29761f43c5"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.165403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" event={"ID":"0a88aa66-b634-44ee-8e5b-bfeacb765e57","Type":"ContainerStarted","Data":"4bb52078231b82897e96da9e073eb926d06bf3f15389686c1533aa0597ef25b0"} 
Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.166777 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" event={"ID":"6242683c-24ad-4e22-a7b3-8463e07388c2","Type":"ContainerStarted","Data":"f2ece18acf3e1a9e84f9affb02aa64941da9edaa5466989d492565763b3e141f"} Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.167715 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" event={"ID":"80584c24-3c75-4624-802f-e608f640eeaa","Type":"ContainerStarted","Data":"75a5b9256fe3030df211e53616d6aaacfa661c5623aee3562a86d05a5b0c0ee1"} Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.168130 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" podUID="6242683c-24ad-4e22-a7b3-8463e07388c2" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.202880 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.203252 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.203351 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:26.203327087 +0000 UTC m=+992.183936185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.407820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:24 crc kubenswrapper[4772]: I0127 15:23:24.407942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.408108 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.408245 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. 
No retries permitted until 2026-01-27 15:23:26.408151246 +0000 UTC m=+992.388760334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.408228 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:24 crc kubenswrapper[4772]: E0127 15:23:24.408549 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:26.408539478 +0000 UTC m=+992.389148576 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.183860 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:445e951df2f21df6d33a466f75917e0f6103052ae751ae11887136e8ab165922\\\"\"" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" podUID="c09741c3-6bae-487a-9b4c-7c9f01d8c5bf" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.184879 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:fa46fc14710961e6b4a76a3522dca3aa3cfa71436c7cf7ade533d3712822f327\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" podUID="e76712a7-ebf6-4f04-a52c-c8d2bacb87f7" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.184927 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/telemetry-operator@sha256:1f1fea3b7df89b81756eab8e6f4c9bed01ab7e949a6ce2d7692c260f41dfbc20\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" podUID="e4a99865-64a7-49e5-bdce-ff929105fc0d" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.185000 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:c8dde42dafd41026ed2e4cfc26efc0fff63c4ba9d31326ae7dc644ccceaafa9d\\\"\"" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" podUID="6242683c-24ad-4e22-a7b3-8463e07388c2" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.185374 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/watcher-operator@sha256:162fb83ed76cbf5d44ba057fbeee02a9182fdf02346afadb3e16b2e3627e1940\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" podUID="783285f4-2e9d-4af5-b017-32676e7d1b01" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.185603 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:013c0ad82d21a21c7eece5cd4b5d5c4b8eb410b6671ac33a6f3fb78c8510811d\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" podUID="c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8" Jan 27 15:23:25 crc kubenswrapper[4772]: I0127 15:23:25.843531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.843676 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:25 crc kubenswrapper[4772]: E0127 15:23:25.843723 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:29.843708996 +0000 UTC m=+995.824318094 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: I0127 15:23:26.249767 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.249948 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.250644 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:30.250194235 +0000 UTC m=+996.230803333 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: I0127 15:23:26.453139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:26 crc kubenswrapper[4772]: I0127 15:23:26.453276 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.453305 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.453379 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:30.453359077 +0000 UTC m=+996.433968175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.453413 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:26 crc kubenswrapper[4772]: E0127 15:23:26.453473 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:30.45345915 +0000 UTC m=+996.434068248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:29 crc kubenswrapper[4772]: I0127 15:23:29.903501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:29 crc kubenswrapper[4772]: E0127 15:23:29.903934 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:29 crc kubenswrapper[4772]: E0127 15:23:29.904082 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert 
podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:37.904062851 +0000 UTC m=+1003.884671959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: I0127 15:23:30.308051 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.308560 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.308781 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:38.308753177 +0000 UTC m=+1004.289362295 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: I0127 15:23:30.512711 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:30 crc kubenswrapper[4772]: I0127 15:23:30.512856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.512901 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.512978 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:38.512961229 +0000 UTC m=+1004.493570327 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.513095 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:30 crc kubenswrapper[4772]: E0127 15:23:30.513146 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:38.513131184 +0000 UTC m=+1004.493740282 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.214550 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.216605 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.228250 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.391156 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.391383 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.391590 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29mzp\" (UniqueName: \"kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.492806 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29mzp\" (UniqueName: \"kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.492891 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.492962 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.493525 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.493596 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.533300 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29mzp\" (UniqueName: \"kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp\") pod \"community-operators-zvzxk\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:35 crc kubenswrapper[4772]: I0127 15:23:35.575270 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.393777 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.394481 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-snjkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-67dd55ff59-hgscb_openstack-operators(fb300814-fca7-4419-ac6e-c08b33edd4be): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.395864 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" podUID="fb300814-fca7-4419-ac6e-c08b33edd4be" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.910779 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/ironic-operator@sha256:30e2224475338d3a02d617ae147dc7dc09867cce4ac3543b313a1923c46299fa" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.910990 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/lmiccini/ironic-operator@sha256:30e2224475338d3a02d617ae147dc7dc09867cce4ac3543b313a1923c46299fa,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zs77k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-768b776ffb-sxbjn_openstack-operators(2df85221-33ed-49be-949c-516810279e4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.912226 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" podUID="2df85221-33ed-49be-949c-516810279e4d" Jan 27 15:23:37 crc kubenswrapper[4772]: I0127 15:23:37.933180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.933353 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Jan 27 15:23:37 crc kubenswrapper[4772]: E0127 15:23:37.933434 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert podName:e7465bd0-3b6e-4199-9ee6-28b512198847 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:53.933414493 +0000 UTC m=+1019.914023591 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert") pod "infra-operator-controller-manager-7d75bc88d5-t54fr" (UID: "e7465bd0-3b6e-4199-9ee6-28b512198847") : secret "infra-operator-webhook-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.287597 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/ironic-operator@sha256:30e2224475338d3a02d617ae147dc7dc09867cce4ac3543b313a1923c46299fa\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" podUID="2df85221-33ed-49be-949c-516810279e4d" Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.288052 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/glance-operator@sha256:bc45409dff26aca6bd982684cfaf093548adb6a71928f5257fe60ab5535dda39\\\"\"" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" podUID="fb300814-fca7-4419-ac6e-c08b33edd4be" Jan 27 15:23:38 crc kubenswrapper[4772]: I0127 15:23:38.338757 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.338970 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.339053 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert podName:1389813b-42ea-433f-820c-e5b8b41713d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:23:54.339031986 +0000 UTC m=+1020.319641124 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854q994c" (UID: "1389813b-42ea-433f-820c-e5b8b41713d7") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: I0127 15:23:38.542127 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:38 crc kubenswrapper[4772]: I0127 15:23:38.542227 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.542399 4772 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.542447 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.542464 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:54.542447266 +0000 UTC m=+1020.523056364 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "metrics-server-cert" not found Jan 27 15:23:38 crc kubenswrapper[4772]: E0127 15:23:38.542537 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs podName:8087d8d3-d2f6-4bca-abec-f5b5335f26fa nodeName:}" failed. No retries permitted until 2026-01-27 15:23:54.542517988 +0000 UTC m=+1020.523127086 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs") pod "openstack-operator-controller-manager-ff554fc88-clt4p" (UID: "8087d8d3-d2f6-4bca-abec-f5b5335f26fa") : secret "webhook-server-cert" not found Jan 27 15:23:39 crc kubenswrapper[4772]: E0127 15:23:39.585718 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f" Jan 27 15:23:39 crc kubenswrapper[4772]: E0127 15:23:39.585930 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4xv2c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-77554cdc5c-tkr6j_openstack-operators(d395f105-54f0-4497-a119-57802be313a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:39 crc kubenswrapper[4772]: E0127 15:23:39.599204 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" podUID="d395f105-54f0-4497-a119-57802be313a3" Jan 27 15:23:40 crc kubenswrapper[4772]: E0127 15:23:40.082940 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84" Jan 27 15:23:40 crc kubenswrapper[4772]: E0127 15:23:40.083132 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8n6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6b9fb5fdcb-zhd82_openstack-operators(b73c175a-e89e-434f-996a-65c1140bb8dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:40 crc kubenswrapper[4772]: E0127 15:23:40.084344 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" podUID="b73c175a-e89e-434f-996a-65c1140bb8dd" Jan 27 15:23:40 crc kubenswrapper[4772]: E0127 15:23:40.299184 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:b673f00227298dcfa89abb46f8296a0825add42da41e8a4bf4dd13367c738d84\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" podUID="b73c175a-e89e-434f-996a-65c1140bb8dd" Jan 27 15:23:40 crc kubenswrapper[4772]: E0127 15:23:40.301094 4772 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/designate-operator@sha256:d26a32730ba8b64e98f68194bd1a766aadc942392b24fa6a2cf1c136969dd99f\\\"\"" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" podUID="d395f105-54f0-4497-a119-57802be313a3" Jan 27 15:23:41 crc kubenswrapper[4772]: E0127 15:23:41.582661 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/keystone-operator@sha256:008a2e338430e7dd513f81f66320cc5c1332c332a3191b537d75786489d7f487" Jan 27 15:23:41 crc kubenswrapper[4772]: E0127 15:23:41.583127 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/keystone-operator@sha256:008a2e338430e7dd513f81f66320cc5c1332c332a3191b537d75786489d7f487,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dnnt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-55f684fd56-wzjrz_openstack-operators(783d8159-e67a-4796-83d8-4eff27d79505): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:41 crc kubenswrapper[4772]: E0127 15:23:41.584333 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" podUID="783d8159-e67a-4796-83d8-4eff27d79505" Jan 27 15:23:42 crc kubenswrapper[4772]: E0127 15:23:42.331463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/lmiccini/keystone-operator@sha256:008a2e338430e7dd513f81f66320cc5c1332c332a3191b537d75786489d7f487\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" podUID="783d8159-e67a-4796-83d8-4eff27d79505" Jan 27 15:23:43 crc kubenswrapper[4772]: E0127 15:23:43.574359 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 27 15:23:43 crc kubenswrapper[4772]: E0127 15:23:43.574538 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t57ck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-h9297_openstack-operators(abaf1142-1b7c-4987-8a9d-c91e6456c4a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:43 crc kubenswrapper[4772]: E0127 15:23:43.575677 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" podUID="abaf1142-1b7c-4987-8a9d-c91e6456c4a5" Jan 27 15:23:44 crc kubenswrapper[4772]: E0127 15:23:44.018874 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61" Jan 27 15:23:44 crc kubenswrapper[4772]: E0127 15:23:44.019068 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gbb4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-ddcbfd695-6wltn_openstack-operators(e7fc5297-101a-496e-a7c6-e7296e08a5af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:23:44 crc kubenswrapper[4772]: E0127 15:23:44.020463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" podUID="e7fc5297-101a-496e-a7c6-e7296e08a5af" Jan 27 15:23:44 crc kubenswrapper[4772]: E0127 15:23:44.342505 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/nova-operator@sha256:a992613466db3478a00c20c28639c4a12f6326aa52c40a418d1ec40038c83b61\\\"\"" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" podUID="e7fc5297-101a-496e-a7c6-e7296e08a5af" Jan 27 15:23:44 crc kubenswrapper[4772]: E0127 15:23:44.344678 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" podUID="abaf1142-1b7c-4987-8a9d-c91e6456c4a5" Jan 27 15:23:44 crc kubenswrapper[4772]: I0127 15:23:44.405847 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:23:45 crc kubenswrapper[4772]: W0127 15:23:45.544589 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2e732b_dbc8_423e_8a37_100a97dad4f0.slice/crio-0a190d2f48d8673469fb8f939468a4f2e791b7ede4b24560514885c4a4400b46 WatchSource:0}: Error finding container 0a190d2f48d8673469fb8f939468a4f2e791b7ede4b24560514885c4a4400b46: Status 404 returned error can't find the container with id 0a190d2f48d8673469fb8f939468a4f2e791b7ede4b24560514885c4a4400b46 Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.369605 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" event={"ID":"0a88aa66-b634-44ee-8e5b-bfeacb765e57","Type":"ContainerStarted","Data":"61c1a5d21ba7cf1ceb1dae1e67a1af2da7cfe582c92cd7a092dab87ddc196a3e"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.369890 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.371467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" event={"ID":"c09741c3-6bae-487a-9b4c-7c9f01d8c5bf","Type":"ContainerStarted","Data":"c228aa40cd004f4965b5cdfda9f435ff3667ece11ee27b9b4ff3a639d6972431"} Jan 27 15:23:46 crc 
kubenswrapper[4772]: I0127 15:23:46.371610 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.372668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" event={"ID":"fde95124-892b-411a-ba05-fa70927c8838","Type":"ContainerStarted","Data":"2ddf9e7acdd771dd121c6f9d9dd9eee33de1eb9a1f7294bef8ef1c515b49ac9c"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.372735 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.373963 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerID="8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee" exitCode=0 Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.373995 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerDied","Data":"8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.374023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerStarted","Data":"0a190d2f48d8673469fb8f939468a4f2e791b7ede4b24560514885c4a4400b46"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.375276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" event={"ID":"783285f4-2e9d-4af5-b017-32676e7d1b01","Type":"ContainerStarted","Data":"822d53489c2003e235600c078a444653771377e0334bc7cc3aedeac351b843f4"} 
Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.375396 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.376977 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" event={"ID":"6242683c-24ad-4e22-a7b3-8463e07388c2","Type":"ContainerStarted","Data":"f00bbc75046fbf41d8ec762fd40a7980cd00ad96650abfc4ac7521f91bdee7b4"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.377116 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.378301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" event={"ID":"4c63a702-50b9-42f3-858e-7e27da0a8d8f","Type":"ContainerStarted","Data":"ad20cd623ff931fc39aedbefd9e6d702c1605209542c84765b9843c8623057c5"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.378413 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.379526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" event={"ID":"e76712a7-ebf6-4f04-a52c-c8d2bacb87f7","Type":"ContainerStarted","Data":"4f465583215e8b9affb764b7283ff1474fcd8e3110d6c381e44d1b80d0cbfa4c"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.379894 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.381208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" event={"ID":"80584c24-3c75-4624-802f-e608f640eeaa","Type":"ContainerStarted","Data":"407a631765036d42cd7fa2e38f2ab5d80a6360a56ce7e0782be0431d4c1e0500"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.381542 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.382905 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" event={"ID":"e4a99865-64a7-49e5-bdce-ff929105fc0d","Type":"ContainerStarted","Data":"e1c3b7fb626f8b787658cc8869351224487fc8c81ad9666b3ac00319adb88188"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.383254 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.384264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" event={"ID":"674f4da6-f50d-4bab-808d-56ab3b9e2cb4","Type":"ContainerStarted","Data":"c9918d5f6cf717de51ccdc59db2e1329ce487f5634a8a46443e865ca86ced277"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.384604 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.386581 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" event={"ID":"27ec5082-c170-465b-b3a3-1f27a545fd71","Type":"ContainerStarted","Data":"917d957605680558b921d84fee15cf764cfec3333e2ce31f348d64b0a4296781"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.386954 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.387955 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" event={"ID":"c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8","Type":"ContainerStarted","Data":"82ce2efa073bd7645b0a3f77e0b34b013a044b94a5f60b09cc16f705a749a7d7"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.388360 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.389477 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" event={"ID":"e85aef3a-e235-473c-94cc-1f6237798b3e","Type":"ContainerStarted","Data":"281f7228e075798fa1e0b5437241536e28c11266d9ca7a4a73d0ca4d9a81eff6"} Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.389806 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.408824 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" podStartSLOduration=4.9120735159999995 podStartE2EDuration="24.408808832s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.644001303 +0000 UTC m=+989.624610401" lastFinishedPulling="2026-01-27 15:23:43.140736629 +0000 UTC m=+1009.121345717" observedRunningTime="2026-01-27 15:23:46.388755665 +0000 UTC m=+1012.369364753" watchObservedRunningTime="2026-01-27 15:23:46.408808832 +0000 UTC m=+1012.389417930" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.476251 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" podStartSLOduration=5.607960751 podStartE2EDuration="25.476235361s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.273490369 +0000 UTC m=+989.254099467" lastFinishedPulling="2026-01-27 15:23:43.141764959 +0000 UTC m=+1009.122374077" observedRunningTime="2026-01-27 15:23:46.474860431 +0000 UTC m=+1012.455469529" watchObservedRunningTime="2026-01-27 15:23:46.476235361 +0000 UTC m=+1012.456844449" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.478620 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" podStartSLOduration=4.5739294919999995 podStartE2EDuration="24.478609409s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.237508554 +0000 UTC m=+989.218117652" lastFinishedPulling="2026-01-27 15:23:43.142188471 +0000 UTC m=+1009.122797569" observedRunningTime="2026-01-27 15:23:46.453541508 +0000 UTC m=+1012.434150606" watchObservedRunningTime="2026-01-27 15:23:46.478609409 +0000 UTC m=+1012.459218497" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.533795 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" podStartSLOduration=2.779634278 podStartE2EDuration="24.533777325s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.870489526 +0000 UTC m=+989.851098624" lastFinishedPulling="2026-01-27 15:23:45.624632573 +0000 UTC m=+1011.605241671" observedRunningTime="2026-01-27 15:23:46.528152454 +0000 UTC m=+1012.508761552" watchObservedRunningTime="2026-01-27 15:23:46.533777325 +0000 UTC m=+1012.514386423" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.564776 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" podStartSLOduration=2.743378806 podStartE2EDuration="24.564761826s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.768355339 +0000 UTC m=+989.748964437" lastFinishedPulling="2026-01-27 15:23:45.589738359 +0000 UTC m=+1011.570347457" observedRunningTime="2026-01-27 15:23:46.559620739 +0000 UTC m=+1012.540229837" watchObservedRunningTime="2026-01-27 15:23:46.564761826 +0000 UTC m=+1012.545370924" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.581793 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" podStartSLOduration=2.756173785 podStartE2EDuration="24.581777636s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.828021705 +0000 UTC m=+989.808630823" lastFinishedPulling="2026-01-27 15:23:45.653625576 +0000 UTC m=+1011.634234674" observedRunningTime="2026-01-27 15:23:46.576415362 +0000 UTC m=+1012.557024450" watchObservedRunningTime="2026-01-27 15:23:46.581777636 +0000 UTC m=+1012.562386734" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.602800 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" podStartSLOduration=5.106107756 podStartE2EDuration="24.60278324s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.644444366 +0000 UTC m=+989.625053474" lastFinishedPulling="2026-01-27 15:23:43.14111986 +0000 UTC m=+1009.121728958" observedRunningTime="2026-01-27 15:23:46.5972263 +0000 UTC m=+1012.577835398" watchObservedRunningTime="2026-01-27 15:23:46.60278324 +0000 UTC m=+1012.583392338" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.655342 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" podStartSLOduration=2.7231234840000003 podStartE2EDuration="24.655328061s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.761383409 +0000 UTC m=+989.741992507" lastFinishedPulling="2026-01-27 15:23:45.693587986 +0000 UTC m=+1011.674197084" observedRunningTime="2026-01-27 15:23:46.618843072 +0000 UTC m=+1012.599452170" watchObservedRunningTime="2026-01-27 15:23:46.655328061 +0000 UTC m=+1012.635937159" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.656861 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" podStartSLOduration=2.72923875 podStartE2EDuration="24.656856215s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.753976736 +0000 UTC m=+989.734585834" lastFinishedPulling="2026-01-27 15:23:45.681594201 +0000 UTC m=+1011.662203299" observedRunningTime="2026-01-27 15:23:46.650005348 +0000 UTC m=+1012.630614446" watchObservedRunningTime="2026-01-27 15:23:46.656856215 +0000 UTC m=+1012.637465313" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.671361 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" podStartSLOduration=2.893819282 podStartE2EDuration="24.671343111s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.867438728 +0000 UTC m=+989.848047836" lastFinishedPulling="2026-01-27 15:23:45.644962567 +0000 UTC m=+1011.625571665" observedRunningTime="2026-01-27 15:23:46.667506121 +0000 UTC m=+1012.648115219" watchObservedRunningTime="2026-01-27 15:23:46.671343111 +0000 UTC m=+1012.651952209" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.686221 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" podStartSLOduration=5.401818595 podStartE2EDuration="25.686205589s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:22.856336515 +0000 UTC m=+988.836945613" lastFinishedPulling="2026-01-27 15:23:43.140723509 +0000 UTC m=+1009.121332607" observedRunningTime="2026-01-27 15:23:46.681860904 +0000 UTC m=+1012.662470002" watchObservedRunningTime="2026-01-27 15:23:46.686205589 +0000 UTC m=+1012.666814687" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.711325 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" podStartSLOduration=6.010916579 podStartE2EDuration="25.711310421s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.440838761 +0000 UTC m=+989.421447859" lastFinishedPulling="2026-01-27 15:23:43.141232603 +0000 UTC m=+1009.121841701" observedRunningTime="2026-01-27 15:23:46.709488638 +0000 UTC m=+1012.690097736" watchObservedRunningTime="2026-01-27 15:23:46.711310421 +0000 UTC m=+1012.691919519" Jan 27 15:23:46 crc kubenswrapper[4772]: I0127 15:23:46.728557 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" podStartSLOduration=6.096477309 podStartE2EDuration="25.728535636s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.50964998 +0000 UTC m=+989.490259078" lastFinishedPulling="2026-01-27 15:23:43.141708277 +0000 UTC m=+1009.122317405" observedRunningTime="2026-01-27 15:23:46.72381218 +0000 UTC m=+1012.704421278" watchObservedRunningTime="2026-01-27 15:23:46.728535636 +0000 UTC m=+1012.709144724" Jan 27 15:23:47 crc kubenswrapper[4772]: I0127 15:23:47.436870 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerStarted","Data":"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4"} Jan 27 15:23:48 crc kubenswrapper[4772]: I0127 15:23:48.444962 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerID="bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4" exitCode=0 Jan 27 15:23:48 crc kubenswrapper[4772]: I0127 15:23:48.445034 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerDied","Data":"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4"} Jan 27 15:23:49 crc kubenswrapper[4772]: I0127 15:23:49.452424 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" event={"ID":"2df85221-33ed-49be-949c-516810279e4d","Type":"ContainerStarted","Data":"d23ba4f41b49c304e45cc62ff09182ed6204bc15d9389cc2650c71ceffe46b93"} Jan 27 15:23:49 crc kubenswrapper[4772]: I0127 15:23:49.452895 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:23:49 crc kubenswrapper[4772]: I0127 15:23:49.453986 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerStarted","Data":"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3"} Jan 27 15:23:49 crc kubenswrapper[4772]: I0127 15:23:49.470700 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" podStartSLOduration=2.823026481 podStartE2EDuration="28.470679415s" 
podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.429531486 +0000 UTC m=+989.410140574" lastFinishedPulling="2026-01-27 15:23:49.07718442 +0000 UTC m=+1015.057793508" observedRunningTime="2026-01-27 15:23:49.467955917 +0000 UTC m=+1015.448565025" watchObservedRunningTime="2026-01-27 15:23:49.470679415 +0000 UTC m=+1015.451288523" Jan 27 15:23:49 crc kubenswrapper[4772]: I0127 15:23:49.490103 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zvzxk" podStartSLOduration=12.029214242 podStartE2EDuration="14.490085213s" podCreationTimestamp="2026-01-27 15:23:35 +0000 UTC" firstStartedPulling="2026-01-27 15:23:46.374858156 +0000 UTC m=+1012.355467254" lastFinishedPulling="2026-01-27 15:23:48.835729117 +0000 UTC m=+1014.816338225" observedRunningTime="2026-01-27 15:23:49.484814772 +0000 UTC m=+1015.465423870" watchObservedRunningTime="2026-01-27 15:23:49.490085213 +0000 UTC m=+1015.470694311" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.092732 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-65ff799cfd-t42n9" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.121286 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-655bf9cfbb-cgh7j" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.200850 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-575ffb885b-mtd9d" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.331863 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-849fcfbb6b-tvrx9" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.525154 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-jcb4p" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.659985 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7ffd8d76d4-gcpj4" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.820823 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7875d7675-ktfbt" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.962575 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6f75f45d54-ww79v" Jan 27 15:23:52 crc kubenswrapper[4772]: I0127 15:23:52.992831 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-79d5ccc684-vwnwk" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.023211 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-547cbdb99f-l8d48" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.047419 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-69797bbcbd-ln7xf" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.109207 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-799bc87c89-k2l8k" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.166821 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c9bb4b66c-ws2mh" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.487491 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" event={"ID":"783d8159-e67a-4796-83d8-4eff27d79505","Type":"ContainerStarted","Data":"eaecf2e53d15afef34bf272d64352d27aa2ae2b41f14db1894bdd3e36d5c565c"} Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.488062 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.489336 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" event={"ID":"d395f105-54f0-4497-a119-57802be313a3","Type":"ContainerStarted","Data":"cf7963476a36c8cfaba13cac3e2de907e23c7953a1ff59b8219b9b83cab2617c"} Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.489675 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.526148 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" podStartSLOduration=1.877660611 podStartE2EDuration="31.526128699s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.45923593 +0000 UTC m=+989.439845038" lastFinishedPulling="2026-01-27 15:23:53.107704028 +0000 UTC m=+1019.088313126" observedRunningTime="2026-01-27 15:23:53.517369407 +0000 UTC m=+1019.497978505" watchObservedRunningTime="2026-01-27 15:23:53.526128699 +0000 UTC m=+1019.506737797" Jan 27 15:23:53 crc kubenswrapper[4772]: I0127 15:23:53.539778 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" podStartSLOduration=2.752698378 podStartE2EDuration="32.539757901s" podCreationTimestamp="2026-01-27 15:23:21 
+0000 UTC" firstStartedPulling="2026-01-27 15:23:23.322454617 +0000 UTC m=+989.303063715" lastFinishedPulling="2026-01-27 15:23:53.10951414 +0000 UTC m=+1019.090123238" observedRunningTime="2026-01-27 15:23:53.533570303 +0000 UTC m=+1019.514179401" watchObservedRunningTime="2026-01-27 15:23:53.539757901 +0000 UTC m=+1019.520367009" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.007947 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.014317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7465bd0-3b6e-4199-9ee6-28b512198847-cert\") pod \"infra-operator-controller-manager-7d75bc88d5-t54fr\" (UID: \"e7465bd0-3b6e-4199-9ee6-28b512198847\") " pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.306352 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.416319 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.441814 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1389813b-42ea-433f-820c-e5b8b41713d7-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854q994c\" (UID: \"1389813b-42ea-433f-820c-e5b8b41713d7\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.548020 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr"] Jan 27 15:23:54 crc kubenswrapper[4772]: W0127 15:23:54.556380 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7465bd0_3b6e_4199_9ee6_28b512198847.slice/crio-f24a8e1d9d29185f462c50004e6568f62134d9ff11f32a055852526a91794abe WatchSource:0}: Error finding container f24a8e1d9d29185f462c50004e6568f62134d9ff11f32a055852526a91794abe: Status 404 returned error can't find the container with id f24a8e1d9d29185f462c50004e6568f62134d9ff11f32a055852526a91794abe Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.619409 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod 
\"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.620282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.623364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-webhook-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.623467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8087d8d3-d2f6-4bca-abec-f5b5335f26fa-metrics-certs\") pod \"openstack-operator-controller-manager-ff554fc88-clt4p\" (UID: \"8087d8d3-d2f6-4bca-abec-f5b5335f26fa\") " pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.631289 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tbw52" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.639447 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.724650 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-xvq6z" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.733531 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:54 crc kubenswrapper[4772]: I0127 15:23:54.887932 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c"] Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.244443 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p"] Jan 27 15:23:55 crc kubenswrapper[4772]: W0127 15:23:55.249439 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8087d8d3_d2f6_4bca_abec_f5b5335f26fa.slice/crio-ecb882c6aa6eabd50ee7443107232aa8adca0db5010033006e18a3e863cfe81e WatchSource:0}: Error finding container ecb882c6aa6eabd50ee7443107232aa8adca0db5010033006e18a3e863cfe81e: Status 404 returned error can't find the container with id ecb882c6aa6eabd50ee7443107232aa8adca0db5010033006e18a3e863cfe81e Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.508686 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" event={"ID":"8087d8d3-d2f6-4bca-abec-f5b5335f26fa","Type":"ContainerStarted","Data":"ecb882c6aa6eabd50ee7443107232aa8adca0db5010033006e18a3e863cfe81e"} Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.509842 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" event={"ID":"1389813b-42ea-433f-820c-e5b8b41713d7","Type":"ContainerStarted","Data":"4837d19db0cf6c82d296ae460cfe0da09cf8817ae0253d8c591124098258e70c"} Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.510683 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" event={"ID":"e7465bd0-3b6e-4199-9ee6-28b512198847","Type":"ContainerStarted","Data":"f24a8e1d9d29185f462c50004e6568f62134d9ff11f32a055852526a91794abe"} Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.575962 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.576395 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:55 crc kubenswrapper[4772]: I0127 15:23:55.618398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.518894 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" event={"ID":"fb300814-fca7-4419-ac6e-c08b33edd4be","Type":"ContainerStarted","Data":"a1d99a9a6663762b7be744c7bdf3a480d1c91959ec97aafb336a67f1b0ee6844"} Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.519389 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.524634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" 
event={"ID":"8087d8d3-d2f6-4bca-abec-f5b5335f26fa","Type":"ContainerStarted","Data":"a213e3d616c938ea6d49b4267e9e03149d9dca6824648a0b735cb41f8091a8e9"} Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.524831 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.527142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" event={"ID":"b73c175a-e89e-434f-996a-65c1140bb8dd","Type":"ContainerStarted","Data":"02bfcf7788d46afaa4c7f3622e3d45b5db87fa3d03d86c7f62f02d3febf429bb"} Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.527413 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.564663 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" podStartSLOduration=4.957935609 podStartE2EDuration="35.564644311s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.456972435 +0000 UTC m=+989.437581533" lastFinishedPulling="2026-01-27 15:23:54.063681137 +0000 UTC m=+1020.044290235" observedRunningTime="2026-01-27 15:23:56.538222262 +0000 UTC m=+1022.518831370" watchObservedRunningTime="2026-01-27 15:23:56.564644311 +0000 UTC m=+1022.545253409" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.568998 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" podStartSLOduration=34.568979976 podStartE2EDuration="34.568979976s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:23:56.561690477 +0000 UTC m=+1022.542299575" watchObservedRunningTime="2026-01-27 15:23:56.568979976 +0000 UTC m=+1022.549589074" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.580955 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" podStartSLOduration=2.312210718 podStartE2EDuration="34.58093623s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.646224387 +0000 UTC m=+989.626833485" lastFinishedPulling="2026-01-27 15:23:55.914949909 +0000 UTC m=+1021.895558997" observedRunningTime="2026-01-27 15:23:56.579421636 +0000 UTC m=+1022.560030744" watchObservedRunningTime="2026-01-27 15:23:56.58093623 +0000 UTC m=+1022.561545328" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.595458 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:56 crc kubenswrapper[4772]: I0127 15:23:56.650719 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:23:58 crc kubenswrapper[4772]: I0127 15:23:58.549305 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zvzxk" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="registry-server" containerID="cri-o://5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3" gracePeriod=2 Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.423911 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.556730 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerID="5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3" exitCode=0 Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.556784 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvzxk" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.556833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerDied","Data":"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.556892 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvzxk" event={"ID":"0f2e732b-dbc8-423e-8a37-100a97dad4f0","Type":"ContainerDied","Data":"0a190d2f48d8673469fb8f939468a4f2e791b7ede4b24560514885c4a4400b46"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.556917 4772 scope.go:117] "RemoveContainer" containerID="5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.558796 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" event={"ID":"e7465bd0-3b6e-4199-9ee6-28b512198847","Type":"ContainerStarted","Data":"823b951e47f8646f1f24d273e1575265c88f88f8998dec9dcd2d7cb961c19b15"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.558908 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.559950 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" event={"ID":"1389813b-42ea-433f-820c-e5b8b41713d7","Type":"ContainerStarted","Data":"81da4d1513c5f4df54640f0fdd172eaf3bb9691a0fbc6c32b00d9a205863ffa4"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.560347 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.583746 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" event={"ID":"abaf1142-1b7c-4987-8a9d-c91e6456c4a5","Type":"ContainerStarted","Data":"0f00940bbe45fb05e0bcfd9ef0cccf5f41c4f0e85045a5ade4492f77f252f39c"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.586414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" event={"ID":"e7fc5297-101a-496e-a7c6-e7296e08a5af","Type":"ContainerStarted","Data":"a306aee00569da20b313031ca959854386d11c011488757e6666ddceb26d0651"} Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.586607 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.590580 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" podStartSLOduration=34.257688451 podStartE2EDuration="38.590566271s" podCreationTimestamp="2026-01-27 15:23:21 +0000 UTC" firstStartedPulling="2026-01-27 15:23:54.557633771 +0000 UTC m=+1020.538242869" lastFinishedPulling="2026-01-27 15:23:58.890511551 +0000 UTC m=+1024.871120689" observedRunningTime="2026-01-27 15:23:59.587450502 +0000 UTC m=+1025.568059600" 
watchObservedRunningTime="2026-01-27 15:23:59.590566271 +0000 UTC m=+1025.571175359" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.598317 4772 scope.go:117] "RemoveContainer" containerID="bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.598848 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29mzp\" (UniqueName: \"kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp\") pod \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.598943 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content\") pod \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.599078 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities\") pod \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\" (UID: \"0f2e732b-dbc8-423e-8a37-100a97dad4f0\") " Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.600185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities" (OuterVolumeSpecName: "utilities") pod "0f2e732b-dbc8-423e-8a37-100a97dad4f0" (UID: "0f2e732b-dbc8-423e-8a37-100a97dad4f0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.607475 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp" (OuterVolumeSpecName: "kube-api-access-29mzp") pod "0f2e732b-dbc8-423e-8a37-100a97dad4f0" (UID: "0f2e732b-dbc8-423e-8a37-100a97dad4f0"). InnerVolumeSpecName "kube-api-access-29mzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.649938 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" podStartSLOduration=33.647529301 podStartE2EDuration="37.649919768s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:54.892987874 +0000 UTC m=+1020.873596972" lastFinishedPulling="2026-01-27 15:23:58.895378341 +0000 UTC m=+1024.875987439" observedRunningTime="2026-01-27 15:23:59.617493166 +0000 UTC m=+1025.598102264" watchObservedRunningTime="2026-01-27 15:23:59.649919768 +0000 UTC m=+1025.630528866" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.669517 4772 scope.go:117] "RemoveContainer" containerID="8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.673822 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" podStartSLOduration=2.426742101 podStartE2EDuration="37.673806745s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.643432197 +0000 UTC m=+989.624041295" lastFinishedPulling="2026-01-27 15:23:58.890496831 +0000 UTC m=+1024.871105939" observedRunningTime="2026-01-27 15:23:59.66981234 +0000 UTC m=+1025.650421438" watchObservedRunningTime="2026-01-27 15:23:59.673806745 
+0000 UTC m=+1025.654415843" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.692528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f2e732b-dbc8-423e-8a37-100a97dad4f0" (UID: "0f2e732b-dbc8-423e-8a37-100a97dad4f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.704147 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29mzp\" (UniqueName: \"kubernetes.io/projected/0f2e732b-dbc8-423e-8a37-100a97dad4f0-kube-api-access-29mzp\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.704199 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.704208 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2e732b-dbc8-423e-8a37-100a97dad4f0-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.707629 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h9297" podStartSLOduration=2.6547782680000003 podStartE2EDuration="37.707618757s" podCreationTimestamp="2026-01-27 15:23:22 +0000 UTC" firstStartedPulling="2026-01-27 15:23:23.838590769 +0000 UTC m=+989.819199867" lastFinishedPulling="2026-01-27 15:23:58.891431258 +0000 UTC m=+1024.872040356" observedRunningTime="2026-01-27 15:23:59.70737155 +0000 UTC m=+1025.687980648" watchObservedRunningTime="2026-01-27 15:23:59.707618757 +0000 UTC m=+1025.688227845" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 
15:23:59.725205 4772 scope.go:117] "RemoveContainer" containerID="5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3" Jan 27 15:23:59 crc kubenswrapper[4772]: E0127 15:23:59.725696 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3\": container with ID starting with 5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3 not found: ID does not exist" containerID="5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.725724 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3"} err="failed to get container status \"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3\": rpc error: code = NotFound desc = could not find container \"5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3\": container with ID starting with 5f0e331299d6605e44535e575ebf4cf5cfa36757bfa41569f0da713738316ff3 not found: ID does not exist" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.725742 4772 scope.go:117] "RemoveContainer" containerID="bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4" Jan 27 15:23:59 crc kubenswrapper[4772]: E0127 15:23:59.730285 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4\": container with ID starting with bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4 not found: ID does not exist" containerID="bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.730331 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4"} err="failed to get container status \"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4\": rpc error: code = NotFound desc = could not find container \"bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4\": container with ID starting with bb5b9739a5d8e52b4be45c2313787179349baaa953432e2d37f9e46057ccb4e4 not found: ID does not exist" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.730356 4772 scope.go:117] "RemoveContainer" containerID="8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee" Jan 27 15:23:59 crc kubenswrapper[4772]: E0127 15:23:59.732117 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee\": container with ID starting with 8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee not found: ID does not exist" containerID="8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.732137 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee"} err="failed to get container status \"8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee\": rpc error: code = NotFound desc = could not find container \"8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee\": container with ID starting with 8b01102f5cfb35617b002369451b43259452694e6501916f5c0f4f257352eaee not found: ID does not exist" Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.880253 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:23:59 crc kubenswrapper[4772]: I0127 15:23:59.884707 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-zvzxk"] Jan 27 15:24:00 crc kubenswrapper[4772]: I0127 15:24:00.674403 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" path="/var/lib/kubelet/pods/0f2e732b-dbc8-423e-8a37-100a97dad4f0/volumes" Jan 27 15:24:02 crc kubenswrapper[4772]: I0127 15:24:02.136229 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-77554cdc5c-tkr6j" Jan 27 15:24:02 crc kubenswrapper[4772]: I0127 15:24:02.154460 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-67dd55ff59-hgscb" Jan 27 15:24:02 crc kubenswrapper[4772]: I0127 15:24:02.317471 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-768b776ffb-sxbjn" Jan 27 15:24:02 crc kubenswrapper[4772]: I0127 15:24:02.583660 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-55f684fd56-wzjrz" Jan 27 15:24:02 crc kubenswrapper[4772]: I0127 15:24:02.770636 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6b9fb5fdcb-zhd82" Jan 27 15:24:04 crc kubenswrapper[4772]: I0127 15:24:04.316341 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d75bc88d5-t54fr" Jan 27 15:24:04 crc kubenswrapper[4772]: I0127 15:24:04.649791 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854q994c" Jan 27 15:24:04 crc kubenswrapper[4772]: I0127 15:24:04.747006 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-manager-ff554fc88-clt4p" Jan 27 15:24:12 crc kubenswrapper[4772]: I0127 15:24:12.716403 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-ddcbfd695-6wltn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.714008 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:24:29 crc kubenswrapper[4772]: E0127 15:24:29.716491 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="registry-server" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.716599 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="registry-server" Jan 27 15:24:29 crc kubenswrapper[4772]: E0127 15:24:29.716685 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="extract-content" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.716752 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="extract-content" Jan 27 15:24:29 crc kubenswrapper[4772]: E0127 15:24:29.716822 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="extract-utilities" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.716903 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="extract-utilities" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.717138 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2e732b-dbc8-423e-8a37-100a97dad4f0" containerName="registry-server" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.718738 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.721182 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.721216 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cnqhx" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.721285 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.723238 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.729777 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.783487 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6ll7\" (UniqueName: \"kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.783851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.784254 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.791102 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.793158 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.799552 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.889236 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.889583 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2msw8\" (UniqueName: \"kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.889705 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6ll7\" (UniqueName: \"kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.889860 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc 
kubenswrapper[4772]: I0127 15:24:29.889979 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.891328 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.911248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6ll7\" (UniqueName: \"kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7\") pod \"dnsmasq-dns-675f4bcbfc-t8dxn\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.991114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.991508 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.991585 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2msw8\" (UniqueName: \"kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.991920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:29 crc kubenswrapper[4772]: I0127 15:24:29.992461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.015036 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2msw8\" (UniqueName: \"kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8\") pod \"dnsmasq-dns-78dd6ddcc-bmnxs\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.035883 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.114224 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.343587 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.356400 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.391681 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:24:30 crc kubenswrapper[4772]: W0127 15:24:30.393493 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580fdc18_8bdc_4a16_89a4_efd7df1b8a17.slice/crio-621419c58a31ba2a4aca778faec82cf28803b219ef85445b2df7f5a1babfe7d1 WatchSource:0}: Error finding container 621419c58a31ba2a4aca778faec82cf28803b219ef85445b2df7f5a1babfe7d1: Status 404 returned error can't find the container with id 621419c58a31ba2a4aca778faec82cf28803b219ef85445b2df7f5a1babfe7d1 Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.877922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" event={"ID":"580fdc18-8bdc-4a16-89a4-efd7df1b8a17","Type":"ContainerStarted","Data":"621419c58a31ba2a4aca778faec82cf28803b219ef85445b2df7f5a1babfe7d1"} Jan 27 15:24:30 crc kubenswrapper[4772]: I0127 15:24:30.878927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" event={"ID":"9e884850-6b45-4657-8c8c-fa8ccdec648d","Type":"ContainerStarted","Data":"2c9714095ed3f7fd054930137f9e21b4fdd5954c595ecb091f834dc4c0fc5111"} Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.421237 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.438220 4772 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.439338 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.450646 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.516542 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.516641 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnqn\" (UniqueName: \"kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.516676 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.617713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 
15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.617797 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnqn\" (UniqueName: \"kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.617826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.618779 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.619298 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.636028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnqn\" (UniqueName: \"kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn\") pod \"dnsmasq-dns-666b6646f7-5wcr9\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:31 crc kubenswrapper[4772]: I0127 15:24:31.757619 4772 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.233948 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.265382 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.266417 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.281695 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:24:32 crc kubenswrapper[4772]: W0127 15:24:32.323751 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece4b345_2aab_4ee3_a116_366d6b8d7bff.slice/crio-69b728c61ccbcb54532eee4001562892bf3d12d85c2e89da36c5a93f0401b107 WatchSource:0}: Error finding container 69b728c61ccbcb54532eee4001562892bf3d12d85c2e89da36c5a93f0401b107: Status 404 returned error can't find the container with id 69b728c61ccbcb54532eee4001562892bf3d12d85c2e89da36c5a93f0401b107 Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.323932 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.335431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.335491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lnjbj\" (UniqueName: \"kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.335535 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.436896 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.436984 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.437021 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnjbj\" (UniqueName: \"kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.438252 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.439002 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.466333 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnjbj\" (UniqueName: \"kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj\") pod \"dnsmasq-dns-57d769cc4f-rsdjb\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.584879 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.586415 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.590490 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.590726 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.593567 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.594183 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.594370 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.594404 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.594512 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6zl48" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.595665 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.597513 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.639888 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.639952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640050 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640090 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8h8d\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640142 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.640188 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.641068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.641141 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743473 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743633 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743737 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743813 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8h8d\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.743830 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.744503 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.744543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.744596 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.744393 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: 
\"508c3d5b-212a-46da-9a55-de3f35d7019b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.744469 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.745033 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.745096 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.745473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.748061 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.748673 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.749489 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.749587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.762939 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8h8d\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.788877 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " pod="openstack/rabbitmq-server-0" Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.902902 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" 
event={"ID":"ece4b345-2aab-4ee3-a116-366d6b8d7bff","Type":"ContainerStarted","Data":"69b728c61ccbcb54532eee4001562892bf3d12d85c2e89da36c5a93f0401b107"} Jan 27 15:24:32 crc kubenswrapper[4772]: I0127 15:24:32.914613 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.123967 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:24:33 crc kubenswrapper[4772]: W0127 15:24:33.152490 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod477337bf_a24a_44fd_9c46_38d2e1566b18.slice/crio-e3d43f62abf3e7b0d500727260573ad5c5ab13345a55b4ce49116d69dfd50dd4 WatchSource:0}: Error finding container e3d43f62abf3e7b0d500727260573ad5c5ab13345a55b4ce49116d69dfd50dd4: Status 404 returned error can't find the container with id e3d43f62abf3e7b0d500727260573ad5c5ab13345a55b4ce49116d69dfd50dd4 Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.430721 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.432944 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.438371 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.438653 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.438907 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.439191 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.439354 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.439525 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.439691 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-frtbn" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455299 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455365 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455450 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbh9\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455483 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455518 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455583 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455612 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.455637 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.464000 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.469717 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-0"] Jan 27 15:24:33 crc kubenswrapper[4772]: W0127 15:24:33.492524 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod508c3d5b_212a_46da_9a55_de3f35d7019b.slice/crio-044e360ab5ed48dba1c044f12dafd0e510d6847bb09f3238ce3b8c8d2130f226 WatchSource:0}: Error finding container 044e360ab5ed48dba1c044f12dafd0e510d6847bb09f3238ce3b8c8d2130f226: Status 404 returned error can't find the container with id 044e360ab5ed48dba1c044f12dafd0e510d6847bb09f3238ce3b8c8d2130f226 Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556227 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556280 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556406 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556474 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556502 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbh9\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 
crc kubenswrapper[4772]: I0127 15:24:33.556561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.556593 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.557205 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.557353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.557495 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.557515 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.557747 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.558291 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.562850 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.564251 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.565294 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.573452 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.574262 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbh9\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.582799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " pod="openstack/rabbitmq-cell1-server-0"
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.764747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.917785 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" event={"ID":"477337bf-a24a-44fd-9c46-38d2e1566b18","Type":"ContainerStarted","Data":"e3d43f62abf3e7b0d500727260573ad5c5ab13345a55b4ce49116d69dfd50dd4"}
Jan 27 15:24:33 crc kubenswrapper[4772]: I0127 15:24:33.919955 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerStarted","Data":"044e360ab5ed48dba1c044f12dafd0e510d6847bb09f3238ce3b8c8d2130f226"}
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.282662 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 27 15:24:34 crc kubenswrapper[4772]: W0127 15:24:34.297432 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76fdbdb1_d48a_4cd1_8372_78887671dce8.slice/crio-09e6c8b66552c99b1f924df5f88d4156d8a5bb2bf8b6bbb8e0fc50cdfa96e1ad WatchSource:0}: Error finding container 09e6c8b66552c99b1f924df5f88d4156d8a5bb2bf8b6bbb8e0fc50cdfa96e1ad: Status 404 returned error can't find the container with id 09e6c8b66552c99b1f924df5f88d4156d8a5bb2bf8b6bbb8e0fc50cdfa96e1ad
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.774645 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.775783 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.781609 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.783760 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.784475 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4dfv4"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.787359 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.787684 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.788678 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879756 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879814 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879834 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxbj\" (UniqueName: \"kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879867 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879888 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.879989 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.935848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerStarted","Data":"09e6c8b66552c99b1f924df5f88d4156d8a5bb2bf8b6bbb8e0fc50cdfa96e1ad"}
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982148 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982186 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxbj\" (UniqueName: \"kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.982382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.983711 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.983775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.984264 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.984445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.986047 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.988800 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:34 crc kubenswrapper[4772]: I0127 15:24:34.989853 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:35 crc kubenswrapper[4772]: I0127 15:24:35.002404 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxbj\" (UniqueName: \"kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:35 crc kubenswrapper[4772]: I0127 15:24:35.019907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " pod="openstack/openstack-galera-0"
Jan 27 15:24:35 crc kubenswrapper[4772]: I0127 15:24:35.118901 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Jan 27 15:24:35 crc kubenswrapper[4772]: I0127 15:24:35.974650 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.255685 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.257217 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.261211 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.261804 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.262290 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xm8qg"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.268879 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.270206 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhlp\" (UniqueName: \"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429384 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429529 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429553 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429685 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.429732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532512 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532545 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhlp\" (UniqueName: \"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532585 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532673 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.532688 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.533217 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.533603 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.534058 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.534420 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.536102 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.557230 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.560293 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.563281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.573231 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.574875 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.576850 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.577242 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.577355 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-5t8xl"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.589446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhlp\" (UniqueName: \"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp\") pod \"openstack-cell1-galera-0\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.606980 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.643204 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.643596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.643642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.643673 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.643726 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75qc\" (UniqueName: \"kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.748229 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.748283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.748329 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r75qc\" (UniqueName: \"kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.748378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.748410 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.749820 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.755391 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.764840 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.766997 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75qc\" (UniqueName: \"kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.772348 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " pod="openstack/memcached-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.888196 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.952508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerStarted","Data":"e699d423eedfd6502021873114f8ac6157951b5b24e3387e2b6a5c652a5f6465"}
Jan 27 15:24:36 crc kubenswrapper[4772]: I0127 15:24:36.957216 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 27 15:24:37 crc kubenswrapper[4772]: I0127 15:24:37.449491 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 15:24:37 crc kubenswrapper[4772]: I0127 15:24:37.562768 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.209096 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.210500 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.218623 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-p4qsl"
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.219005 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.290206 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znb2g\" (UniqueName: \"kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g\") pod \"kube-state-metrics-0\" (UID: \"1ef66151-0ea7-4696-9db0-7b6665731670\") " pod="openstack/kube-state-metrics-0"
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.393969 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znb2g\" (UniqueName: \"kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g\") pod \"kube-state-metrics-0\" (UID: \"1ef66151-0ea7-4696-9db0-7b6665731670\") " pod="openstack/kube-state-metrics-0"
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.447864 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znb2g\" (UniqueName: \"kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g\") pod \"kube-state-metrics-0\" (UID: \"1ef66151-0ea7-4696-9db0-7b6665731670\") " pod="openstack/kube-state-metrics-0"
Jan 27 15:24:38 crc kubenswrapper[4772]: I0127 15:24:38.590607 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.763317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gxjzh"]
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.764765 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.767327 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.768539 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.771097 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cpjv6"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.776684 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"]
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.780311 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.784731 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh"]
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.796499 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"]
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860258 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860410 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860457 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860566 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860585 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860720 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.860884 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvchv\" (UniqueName: \"kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.861050 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.861211 4772
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.861367 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.861418 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sclc\" (UniqueName: \"kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.862160 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvchv\" (UniqueName: \"kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963399 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963446 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963461 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sclc\" (UniqueName: \"kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run\") pod 
\"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963538 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963562 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963604 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.963641 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 
crc kubenswrapper[4772]: I0127 15:24:41.963665 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964197 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964265 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964335 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964405 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.964944 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.966308 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.966789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.969601 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 
27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.971450 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.980060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvchv\" (UniqueName: \"kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv\") pod \"ovn-controller-ovs-cqx7r\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:41 crc kubenswrapper[4772]: I0127 15:24:41.985352 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sclc\" (UniqueName: \"kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc\") pod \"ovn-controller-gxjzh\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:42 crc kubenswrapper[4772]: I0127 15:24:42.090006 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh" Jan 27 15:24:42 crc kubenswrapper[4772]: I0127 15:24:42.108344 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.222713 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.224418 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.226634 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.226639 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.226709 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.228837 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.230022 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-z6lp4" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.237097 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396455 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396477 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396527 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wt7\" (UniqueName: \"kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396557 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.396601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498260 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wt7\" (UniqueName: \"kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498414 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498446 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498475 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " 
pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498530 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498569 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.498977 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.499581 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.499916 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.500290 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.503475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.507983 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.508917 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.517508 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wt7\" (UniqueName: \"kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " 
pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.518949 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:44 crc kubenswrapper[4772]: I0127 15:24:44.567300 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.608627 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.610324 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.612542 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mvxxf" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.612775 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.612905 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.612948 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.625411 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts\") pod \"ovsdbserver-nb-0\" 
(UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720460 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720578 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720626 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720658 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720800 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " 
pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720857 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.720881 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkbw\" (UniqueName: \"kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823448 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823466 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.823544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkbw\" (UniqueName: \"kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.824916 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.825805 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.825984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.826255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.833863 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.835272 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.836902 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.843102 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkbw\" (UniqueName: \"kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.855773 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:45 crc kubenswrapper[4772]: I0127 15:24:45.932411 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 15:24:46 crc kubenswrapper[4772]: I0127 15:24:46.058872 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerStarted","Data":"9858c0fc9167c8fdb9fe56212a74207375b7ea71449891249cf75618c47eff4b"} Jan 27 15:24:46 crc kubenswrapper[4772]: I0127 15:24:46.060927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66","Type":"ContainerStarted","Data":"022c0f29ec3ec9ea31194094e372dfed87fe074f880cb471419a54885eeba246"} Jan 27 15:24:58 crc kubenswrapper[4772]: E0127 15:24:58.466694 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 27 15:24:58 crc kubenswrapper[4772]: E0127 15:24:58.467644 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rxbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(b1515626-5d79-408d-abc1-cb92abd58f3f): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:24:58 crc kubenswrapper[4772]: E0127 15:24:58.468895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.152787 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.488318 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.488536 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gbh9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(76fdbdb1-d48a-4cd1-8372-78887671dce8): ErrImagePull: rpc error: code = Canceled 
desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.489693 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.496048 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Jan 27 15:24:59 crc kubenswrapper[4772]: E0127 15:24:59.496242 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l8h8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(508c3d5b-212a-46da-9a55-de3f35d7019b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:24:59 crc 
kubenswrapper[4772]: E0127 15:24:59.497474 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" Jan 27 15:24:59 crc kubenswrapper[4772]: I0127 15:24:59.805120 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh"] Jan 27 15:25:00 crc kubenswrapper[4772]: E0127 15:25:00.163784 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" Jan 27 15:25:00 crc kubenswrapper[4772]: E0127 15:25:00.164064 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" Jan 27 15:25:00 crc kubenswrapper[4772]: E0127 15:25:00.476588 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 27 15:25:00 crc kubenswrapper[4772]: E0127 15:25:00.476876 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- 
--no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmnqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-5wcr9_openstack(ece4b345-2aab-4ee3-a116-366d6b8d7bff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:25:00 crc 
kubenswrapper[4772]: E0127 15:25:00.478740 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" podUID="ece4b345-2aab-4ee3-a116-366d6b8d7bff" Jan 27 15:25:01 crc kubenswrapper[4772]: W0127 15:25:01.165014 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod220011f2_8778_4a14_82d4_33a07bd33379.slice/crio-3cb1a1a1b7113cd35f8e36164d72f2c95422afc48122fb52f34747897808a62b WatchSource:0}: Error finding container 3cb1a1a1b7113cd35f8e36164d72f2c95422afc48122fb52f34747897808a62b: Status 404 returned error can't find the container with id 3cb1a1a1b7113cd35f8e36164d72f2c95422afc48122fb52f34747897808a62b Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.174828 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" podUID="ece4b345-2aab-4ee3-a116-366d6b8d7bff" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.190528 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.190734 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts 
--domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2msw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bmnxs_openstack(580fdc18-8bdc-4a16-89a4-efd7df1b8a17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 
15:25:01.191929 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" podUID="580fdc18-8bdc-4a16-89a4-efd7df1b8a17" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.217785 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.217936 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6ll7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-t8dxn_openstack(9e884850-6b45-4657-8c8c-fa8ccdec648d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.219310 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" podUID="9e884850-6b45-4657-8c8c-fa8ccdec648d" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.229088 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.229948 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lnjbj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPoli
cy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-rsdjb_openstack(477337bf-a24a-44fd-9c46-38d2e1566b18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:25:01 crc kubenswrapper[4772]: E0127 15:25:01.231248 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" podUID="477337bf-a24a-44fd-9c46-38d2e1566b18" Jan 27 15:25:01 crc kubenswrapper[4772]: I0127 15:25:01.699825 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:25:01 crc kubenswrapper[4772]: I0127 15:25:01.776400 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:25:01 crc kubenswrapper[4772]: W0127 15:25:01.787938 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4270ab9b_f4a9_4d48_9cc2_f25152ee5fb2.slice/crio-2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e WatchSource:0}: Error finding container 2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e: Status 404 returned error can't find the container with id 2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e Jan 27 15:25:01 crc 
kubenswrapper[4772]: I0127 15:25:01.900852 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"] Jan 27 15:25:01 crc kubenswrapper[4772]: W0127 15:25:01.910701 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ebd422_35c5_4682_8a4d_ca9073728d7c.slice/crio-b4ae3e61c086f91c9c3a7442484ecc85a4bdf545d39601e45239a3351393b9ff WatchSource:0}: Error finding container b4ae3e61c086f91c9c3a7442484ecc85a4bdf545d39601e45239a3351393b9ff: Status 404 returned error can't find the container with id b4ae3e61c086f91c9c3a7442484ecc85a4bdf545d39601e45239a3351393b9ff Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.179001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ef66151-0ea7-4696-9db0-7b6665731670","Type":"ContainerStarted","Data":"fabcd309d9b92ca01d4a1240a11210e76a8e365a872f6471e0b9d641c3e1ff39"} Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.180359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh" event={"ID":"220011f2-8778-4a14-82d4-33a07bd33379","Type":"ContainerStarted","Data":"3cb1a1a1b7113cd35f8e36164d72f2c95422afc48122fb52f34747897808a62b"} Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.181794 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerStarted","Data":"0d7ac15f647607d8d8b9ab55f639b5ec78749485b0e54cbc048e0727ed5dbce0"} Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.183225 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66","Type":"ContainerStarted","Data":"faf687181014b14838de86572705cbe5952bdabca1b3fad7e35afc3ce6238c0f"} Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.183427 4772 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.184744 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerStarted","Data":"2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e"} Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.185752 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerStarted","Data":"b4ae3e61c086f91c9c3a7442484ecc85a4bdf545d39601e45239a3351393b9ff"} Jan 27 15:25:02 crc kubenswrapper[4772]: E0127 15:25:02.187139 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" podUID="477337bf-a24a-44fd-9c46-38d2e1566b18" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.281433 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.862511100999999 podStartE2EDuration="26.281414542s" podCreationTimestamp="2026-01-27 15:24:36 +0000 UTC" firstStartedPulling="2026-01-27 15:24:45.836621882 +0000 UTC m=+1071.817230980" lastFinishedPulling="2026-01-27 15:25:01.255525333 +0000 UTC m=+1087.236134421" observedRunningTime="2026-01-27 15:25:02.275698408 +0000 UTC m=+1088.256307506" watchObservedRunningTime="2026-01-27 15:25:02.281414542 +0000 UTC m=+1088.262023640" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.620475 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.625044 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.772698 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6ll7\" (UniqueName: \"kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7\") pod \"9e884850-6b45-4657-8c8c-fa8ccdec648d\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776278 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config\") pod \"9e884850-6b45-4657-8c8c-fa8ccdec648d\" (UID: \"9e884850-6b45-4657-8c8c-fa8ccdec648d\") " Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776310 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc\") pod \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776409 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2msw8\" (UniqueName: \"kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8\") pod \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776477 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config\") pod \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\" (UID: \"580fdc18-8bdc-4a16-89a4-efd7df1b8a17\") " Jan 27 15:25:02 crc 
kubenswrapper[4772]: I0127 15:25:02.776651 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config" (OuterVolumeSpecName: "config") pod "9e884850-6b45-4657-8c8c-fa8ccdec648d" (UID: "9e884850-6b45-4657-8c8c-fa8ccdec648d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.776835 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e884850-6b45-4657-8c8c-fa8ccdec648d-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.777152 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config" (OuterVolumeSpecName: "config") pod "580fdc18-8bdc-4a16-89a4-efd7df1b8a17" (UID: "580fdc18-8bdc-4a16-89a4-efd7df1b8a17"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.777701 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "580fdc18-8bdc-4a16-89a4-efd7df1b8a17" (UID: "580fdc18-8bdc-4a16-89a4-efd7df1b8a17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.783771 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8" (OuterVolumeSpecName: "kube-api-access-2msw8") pod "580fdc18-8bdc-4a16-89a4-efd7df1b8a17" (UID: "580fdc18-8bdc-4a16-89a4-efd7df1b8a17"). InnerVolumeSpecName "kube-api-access-2msw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.784348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7" (OuterVolumeSpecName: "kube-api-access-h6ll7") pod "9e884850-6b45-4657-8c8c-fa8ccdec648d" (UID: "9e884850-6b45-4657-8c8c-fa8ccdec648d"). InnerVolumeSpecName "kube-api-access-h6ll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.877980 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2msw8\" (UniqueName: \"kubernetes.io/projected/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-kube-api-access-2msw8\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.878015 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.878025 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6ll7\" (UniqueName: \"kubernetes.io/projected/9e884850-6b45-4657-8c8c-fa8ccdec648d-kube-api-access-h6ll7\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:02 crc kubenswrapper[4772]: I0127 15:25:02.878034 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/580fdc18-8bdc-4a16-89a4-efd7df1b8a17-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:03 crc kubenswrapper[4772]: W0127 15:25:03.112449 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc34a3a4_ad0b_4154_82c9_728227b19732.slice/crio-3fdbe52ce9493d11baa4cae24955832bd683139a5196f3cc920d2743d2fdf8c2 WatchSource:0}: Error finding container 
3fdbe52ce9493d11baa4cae24955832bd683139a5196f3cc920d2743d2fdf8c2: Status 404 returned error can't find the container with id 3fdbe52ce9493d11baa4cae24955832bd683139a5196f3cc920d2743d2fdf8c2 Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.199590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" event={"ID":"9e884850-6b45-4657-8c8c-fa8ccdec648d","Type":"ContainerDied","Data":"2c9714095ed3f7fd054930137f9e21b4fdd5954c595ecb091f834dc4c0fc5111"} Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.199640 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-t8dxn" Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.202921 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" event={"ID":"580fdc18-8bdc-4a16-89a4-efd7df1b8a17","Type":"ContainerDied","Data":"621419c58a31ba2a4aca778faec82cf28803b219ef85445b2df7f5a1babfe7d1"} Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.202937 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bmnxs" Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.204281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc34a3a4-ad0b-4154-82c9-728227b19732","Type":"ContainerStarted","Data":"3fdbe52ce9493d11baa4cae24955832bd683139a5196f3cc920d2743d2fdf8c2"} Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.268121 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.281125 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-t8dxn"] Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.299160 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:25:03 crc kubenswrapper[4772]: I0127 15:25:03.305065 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bmnxs"] Jan 27 15:25:04 crc kubenswrapper[4772]: I0127 15:25:04.673721 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="580fdc18-8bdc-4a16-89a4-efd7df1b8a17" path="/var/lib/kubelet/pods/580fdc18-8bdc-4a16-89a4-efd7df1b8a17/volumes" Jan 27 15:25:04 crc kubenswrapper[4772]: I0127 15:25:04.674445 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e884850-6b45-4657-8c8c-fa8ccdec648d" path="/var/lib/kubelet/pods/9e884850-6b45-4657-8c8c-fa8ccdec648d/volumes" Jan 27 15:25:05 crc kubenswrapper[4772]: I0127 15:25:05.219492 4772 generic.go:334] "Generic (PLEG): container finished" podID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerID="0d7ac15f647607d8d8b9ab55f639b5ec78749485b0e54cbc048e0727ed5dbce0" exitCode=0 Jan 27 15:25:05 crc kubenswrapper[4772]: I0127 15:25:05.219559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerDied","Data":"0d7ac15f647607d8d8b9ab55f639b5ec78749485b0e54cbc048e0727ed5dbce0"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.228549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh" event={"ID":"220011f2-8778-4a14-82d4-33a07bd33379","Type":"ContainerStarted","Data":"afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.229078 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-gxjzh" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.230528 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerStarted","Data":"2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.232050 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc34a3a4-ad0b-4154-82c9-728227b19732","Type":"ContainerStarted","Data":"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.233789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerStarted","Data":"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.234985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerStarted","Data":"c2ec2d9ef51a12150ebe6df637e29030ff2b622c19a7ada45c6cd396c44b8636"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.236053 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"1ef66151-0ea7-4696-9db0-7b6665731670","Type":"ContainerStarted","Data":"e93f9f446173d4fd985d40db28827a7f313c9dbe0522a2d3003fa93c8ac7de5e"} Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.236213 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.266630 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.074172213 podStartE2EDuration="28.266607566s" podCreationTimestamp="2026-01-27 15:24:38 +0000 UTC" firstStartedPulling="2026-01-27 15:25:01.726247428 +0000 UTC m=+1087.706856526" lastFinishedPulling="2026-01-27 15:25:05.918682781 +0000 UTC m=+1091.899291879" observedRunningTime="2026-01-27 15:25:06.261946682 +0000 UTC m=+1092.242555790" watchObservedRunningTime="2026-01-27 15:25:06.266607566 +0000 UTC m=+1092.247216654" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.267068 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-gxjzh" podStartSLOduration=21.34043953 podStartE2EDuration="25.267059189s" podCreationTimestamp="2026-01-27 15:24:41 +0000 UTC" firstStartedPulling="2026-01-27 15:25:01.175025458 +0000 UTC m=+1087.155634556" lastFinishedPulling="2026-01-27 15:25:05.101645117 +0000 UTC m=+1091.082254215" observedRunningTime="2026-01-27 15:25:06.249475433 +0000 UTC m=+1092.230084531" watchObservedRunningTime="2026-01-27 15:25:06.267059189 +0000 UTC m=+1092.247668287" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.299215 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=15.891828915 podStartE2EDuration="31.299196043s" podCreationTimestamp="2026-01-27 15:24:35 +0000 UTC" firstStartedPulling="2026-01-27 15:24:45.836844049 +0000 UTC m=+1071.817453137" 
lastFinishedPulling="2026-01-27 15:25:01.244211167 +0000 UTC m=+1087.224820265" observedRunningTime="2026-01-27 15:25:06.296615769 +0000 UTC m=+1092.277224897" watchObservedRunningTime="2026-01-27 15:25:06.299196043 +0000 UTC m=+1092.279805141" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.889650 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.889716 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 27 15:25:06 crc kubenswrapper[4772]: I0127 15:25:06.961543 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 27 15:25:07 crc kubenswrapper[4772]: I0127 15:25:07.247400 4772 generic.go:334] "Generic (PLEG): container finished" podID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerID="c2ec2d9ef51a12150ebe6df637e29030ff2b622c19a7ada45c6cd396c44b8636" exitCode=0 Jan 27 15:25:07 crc kubenswrapper[4772]: I0127 15:25:07.247551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerDied","Data":"c2ec2d9ef51a12150ebe6df637e29030ff2b622c19a7ada45c6cd396c44b8636"} Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.259817 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerStarted","Data":"d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91"} Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.260229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerStarted","Data":"4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b"} Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 
15:25:08.260265 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.260283 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.282960 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cqx7r" podStartSLOduration=24.094741048 podStartE2EDuration="27.282942575s" podCreationTimestamp="2026-01-27 15:24:41 +0000 UTC" firstStartedPulling="2026-01-27 15:25:01.913176273 +0000 UTC m=+1087.893785371" lastFinishedPulling="2026-01-27 15:25:05.1013778 +0000 UTC m=+1091.081986898" observedRunningTime="2026-01-27 15:25:08.277226331 +0000 UTC m=+1094.257835429" watchObservedRunningTime="2026-01-27 15:25:08.282942575 +0000 UTC m=+1094.263551673" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.449929 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.498753 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.500121 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.522680 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.577666 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.577732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m449k\" (UniqueName: \"kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.577768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.679643 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.679779 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m449k\" (UniqueName: 
\"kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.680115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.681158 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.681848 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.701778 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m449k\" (UniqueName: \"kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k\") pod \"dnsmasq-dns-7cb5889db5-62ktv\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:08 crc kubenswrapper[4772]: I0127 15:25:08.826621 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.620860 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.627526 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.630071 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.630379 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.630601 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-g7gch" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.630857 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.655518 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.701377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.701456 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 
15:25:09.701548 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlv4\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.701708 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.701767 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.701798 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.802805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.802865 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.802891 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.802990 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.803018 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.803080 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlv4\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.803286 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/swift-storage-0" 
Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.804148 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: E0127 15:25:09.804279 4772 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 15:25:09 crc kubenswrapper[4772]: E0127 15:25:09.804300 4772 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 15:25:09 crc kubenswrapper[4772]: E0127 15:25:09.804352 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift podName:3ef68955-b80c-4732-9e87-0bec53d0b3a0 nodeName:}" failed. No retries permitted until 2026-01-27 15:25:10.304330393 +0000 UTC m=+1096.284939491 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift") pod "swift-storage-0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0") : configmap "swift-ring-files" not found Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.804851 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.822178 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.822247 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlv4\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:09 crc kubenswrapper[4772]: I0127 15:25:09.834661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.025761 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.107601 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc\") pod \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.107657 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnqn\" (UniqueName: \"kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn\") pod \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.107764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config\") pod \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\" (UID: \"ece4b345-2aab-4ee3-a116-366d6b8d7bff\") " Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.111954 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config" (OuterVolumeSpecName: "config") pod "ece4b345-2aab-4ee3-a116-366d6b8d7bff" (UID: "ece4b345-2aab-4ee3-a116-366d6b8d7bff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.112423 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ece4b345-2aab-4ee3-a116-366d6b8d7bff" (UID: "ece4b345-2aab-4ee3-a116-366d6b8d7bff"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.113350 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.113371 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ece4b345-2aab-4ee3-a116-366d6b8d7bff-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.117556 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn" (OuterVolumeSpecName: "kube-api-access-xmnqn") pod "ece4b345-2aab-4ee3-a116-366d6b8d7bff" (UID: "ece4b345-2aab-4ee3-a116-366d6b8d7bff"). InnerVolumeSpecName "kube-api-access-xmnqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.201588 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-d4llz"] Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.203098 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.208425 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.211980 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.212183 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.215572 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d4llz"] Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.215580 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnqn\" (UniqueName: \"kubernetes.io/projected/ece4b345-2aab-4ee3-a116-366d6b8d7bff-kube-api-access-xmnqn\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.272929 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" event={"ID":"ece4b345-2aab-4ee3-a116-366d6b8d7bff","Type":"ContainerDied","Data":"69b728c61ccbcb54532eee4001562892bf3d12d85c2e89da36c5a93f0401b107"} Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.272942 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-5wcr9" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.278199 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc34a3a4-ad0b-4154-82c9-728227b19732","Type":"ContainerStarted","Data":"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed"} Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.283244 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerStarted","Data":"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4"} Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.295157 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.431232725 podStartE2EDuration="27.295134036s" podCreationTimestamp="2026-01-27 15:24:43 +0000 UTC" firstStartedPulling="2026-01-27 15:25:03.114376344 +0000 UTC m=+1089.094985442" lastFinishedPulling="2026-01-27 15:25:09.978277655 +0000 UTC m=+1095.958886753" observedRunningTime="2026-01-27 15:25:10.29492261 +0000 UTC m=+1096.275531708" watchObservedRunningTime="2026-01-27 15:25:10.295134036 +0000 UTC m=+1096.275743134" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.315419 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.127424414 podStartE2EDuration="26.315400069s" podCreationTimestamp="2026-01-27 15:24:44 +0000 UTC" firstStartedPulling="2026-01-27 15:25:01.790571558 +0000 UTC m=+1087.771180656" lastFinishedPulling="2026-01-27 15:25:09.978547213 +0000 UTC m=+1095.959156311" observedRunningTime="2026-01-27 15:25:10.312557167 +0000 UTC m=+1096.293166265" watchObservedRunningTime="2026-01-27 15:25:10.315400069 +0000 UTC m=+1096.296009167" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.316585 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.316634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.316753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.316903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.317026 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.317067 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvbq9\" (UniqueName: \"kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.317102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.317133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:10 crc kubenswrapper[4772]: E0127 15:25:10.317372 4772 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 15:25:10 crc kubenswrapper[4772]: E0127 15:25:10.317393 4772 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 15:25:10 crc kubenswrapper[4772]: E0127 15:25:10.317432 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift podName:3ef68955-b80c-4732-9e87-0bec53d0b3a0 nodeName:}" failed. No retries permitted until 2026-01-27 15:25:11.317418117 +0000 UTC m=+1097.298027215 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift") pod "swift-storage-0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0") : configmap "swift-ring-files" not found Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.351280 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.358475 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-5wcr9"] Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.398725 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.419060 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.419124 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.419209 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.419231 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mvbq9\" (UniqueName: \"kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.420101 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.419723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.420277 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.420341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.420704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.420986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.426642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.428918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.431302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf\") pod \"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.438825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbq9\" (UniqueName: \"kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9\") pod 
\"swift-ring-rebalance-d4llz\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.529193 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.675526 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ece4b345-2aab-4ee3-a116-366d6b8d7bff" path="/var/lib/kubelet/pods/ece4b345-2aab-4ee3-a116-366d6b8d7bff/volumes" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.933041 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 27 15:25:10 crc kubenswrapper[4772]: I0127 15:25:10.975017 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-d4llz"] Jan 27 15:25:10 crc kubenswrapper[4772]: W0127 15:25:10.997109 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2313c291_4eb5_4b79_ad9b_b04cd06a1ee9.slice/crio-1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39 WatchSource:0}: Error finding container 1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39: Status 404 returned error can't find the container with id 1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39 Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.010335 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.097423 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.290399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerStarted","Data":"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39"} Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.291906 4772 generic.go:334] "Generic (PLEG): container finished" podID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerID="932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73" exitCode=0 Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.291940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" event={"ID":"8aead2c0-bb19-4542-8736-67943c23f0c0","Type":"ContainerDied","Data":"932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73"} Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.291970 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" event={"ID":"8aead2c0-bb19-4542-8736-67943c23f0c0","Type":"ContainerStarted","Data":"80889f4358a04bf7fac97ea1ad80ad55c76754b98341035087589aff0399485d"} Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.293128 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d4llz" event={"ID":"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9","Type":"ContainerStarted","Data":"1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39"} Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.337297 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:11 crc kubenswrapper[4772]: E0127 15:25:11.337793 4772 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 15:25:11 crc kubenswrapper[4772]: E0127 15:25:11.337822 4772 projected.go:194] Error preparing data for projected 
volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 15:25:11 crc kubenswrapper[4772]: E0127 15:25:11.337859 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift podName:3ef68955-b80c-4732-9e87-0bec53d0b3a0 nodeName:}" failed. No retries permitted until 2026-01-27 15:25:13.337844118 +0000 UTC m=+1099.318453216 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift") pod "swift-storage-0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0") : configmap "swift-ring-files" not found Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.567947 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 27 15:25:11 crc kubenswrapper[4772]: I0127 15:25:11.607702 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.058934 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.059049 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.300152 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" 
event={"ID":"8aead2c0-bb19-4542-8736-67943c23f0c0","Type":"ContainerStarted","Data":"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579"} Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.300687 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.320895 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" podStartSLOduration=3.719444193 podStartE2EDuration="4.320880106s" podCreationTimestamp="2026-01-27 15:25:08 +0000 UTC" firstStartedPulling="2026-01-27 15:25:10.398788127 +0000 UTC m=+1096.379397225" lastFinishedPulling="2026-01-27 15:25:11.00022401 +0000 UTC m=+1096.980833138" observedRunningTime="2026-01-27 15:25:12.318157117 +0000 UTC m=+1098.298766205" watchObservedRunningTime="2026-01-27 15:25:12.320880106 +0000 UTC m=+1098.301489204" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.343419 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.612127 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.632524 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.633834 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.636075 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.647152 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.686187 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.689128 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.689312 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.692030 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.763534 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22gz\" (UniqueName: \"kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.763662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.763713 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.763812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.822142 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.853222 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.863268 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22gz\" (UniqueName: \"kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864825 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864844 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864865 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmhcm\" (UniqueName: \"kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm\") 
pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864901 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.864987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.865013 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: 
\"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.865833 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.865836 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.866004 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.866073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.866855 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.905825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22gz\" (UniqueName: \"kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz\") pod \"dnsmasq-dns-6c89d5d749-vrz8v\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.933383 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.959076 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966182 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966290 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966324 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966356 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966396 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966468 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966486 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmhcm\" (UniqueName: \"kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966513 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pzt\" (UniqueName: \"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.966553 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.967291 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.967681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.970302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:12 crc kubenswrapper[4772]: I0127 15:25:12.970424 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.029004 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.052789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmhcm\" (UniqueName: \"kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm\") pod \"ovn-controller-metrics-vqpfg\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077121 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077191 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pzt\" (UniqueName: \"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.077976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.078475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.078988 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.079495 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.116602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pzt\" (UniqueName: \"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt\") pod \"dnsmasq-dns-698758b865-tltm6\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.201225 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.308455 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.326655 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.347230 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.381319 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:13 crc kubenswrapper[4772]: E0127 15:25:13.382375 4772 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 15:25:13 crc kubenswrapper[4772]: E0127 15:25:13.382404 4772 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 15:25:13 crc kubenswrapper[4772]: E0127 15:25:13.382461 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift podName:3ef68955-b80c-4732-9e87-0bec53d0b3a0 nodeName:}" failed. No retries permitted until 2026-01-27 15:25:17.382441331 +0000 UTC m=+1103.363050449 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift") pod "swift-storage-0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0") : configmap "swift-ring-files" not found Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.515849 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.518308 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.521342 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.521561 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-frj8c" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.521722 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.521876 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.528980 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.583900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.583945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.583969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-429qb\" (UniqueName: \"kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " 
pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.584040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.584069 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.584111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.584129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685630 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685684 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685714 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-429qb\" (UniqueName: \"kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685851 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685914 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.685938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.686029 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.686663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.690815 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.691556 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.691719 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.691783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.703109 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-429qb\" (UniqueName: \"kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb\") pod \"ovn-northd-0\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " pod="openstack/ovn-northd-0" Jan 27 15:25:13 crc kubenswrapper[4772]: I0127 15:25:13.839708 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.316105 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerName="dnsmasq-dns" containerID="cri-o://80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579" gracePeriod=10 Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.716673 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.816008 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config\") pod \"477337bf-a24a-44fd-9c46-38d2e1566b18\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.816375 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnjbj\" (UniqueName: \"kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj\") pod \"477337bf-a24a-44fd-9c46-38d2e1566b18\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.816412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc\") pod \"477337bf-a24a-44fd-9c46-38d2e1566b18\" (UID: \"477337bf-a24a-44fd-9c46-38d2e1566b18\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.820442 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "477337bf-a24a-44fd-9c46-38d2e1566b18" (UID: "477337bf-a24a-44fd-9c46-38d2e1566b18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.820730 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config" (OuterVolumeSpecName: "config") pod "477337bf-a24a-44fd-9c46-38d2e1566b18" (UID: "477337bf-a24a-44fd-9c46-38d2e1566b18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.827277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj" (OuterVolumeSpecName: "kube-api-access-lnjbj") pod "477337bf-a24a-44fd-9c46-38d2e1566b18" (UID: "477337bf-a24a-44fd-9c46-38d2e1566b18"). InnerVolumeSpecName "kube-api-access-lnjbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.858591 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.918794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m449k\" (UniqueName: \"kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k\") pod \"8aead2c0-bb19-4542-8736-67943c23f0c0\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.918884 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config\") pod \"8aead2c0-bb19-4542-8736-67943c23f0c0\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.918978 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc\") pod \"8aead2c0-bb19-4542-8736-67943c23f0c0\" (UID: \"8aead2c0-bb19-4542-8736-67943c23f0c0\") " Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.919415 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-config\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.919433 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnjbj\" (UniqueName: \"kubernetes.io/projected/477337bf-a24a-44fd-9c46-38d2e1566b18-kube-api-access-lnjbj\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.919445 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/477337bf-a24a-44fd-9c46-38d2e1566b18-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.927125 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k" (OuterVolumeSpecName: "kube-api-access-m449k") pod "8aead2c0-bb19-4542-8736-67943c23f0c0" (UID: "8aead2c0-bb19-4542-8736-67943c23f0c0"). InnerVolumeSpecName "kube-api-access-m449k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.954683 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config" (OuterVolumeSpecName: "config") pod "8aead2c0-bb19-4542-8736-67943c23f0c0" (UID: "8aead2c0-bb19-4542-8736-67943c23f0c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:14 crc kubenswrapper[4772]: I0127 15:25:14.973535 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8aead2c0-bb19-4542-8736-67943c23f0c0" (UID: "8aead2c0-bb19-4542-8736-67943c23f0c0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.020849 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.020882 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m449k\" (UniqueName: \"kubernetes.io/projected/8aead2c0-bb19-4542-8736-67943c23f0c0-kube-api-access-m449k\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.020915 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8aead2c0-bb19-4542-8736-67943c23f0c0-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.139052 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.200980 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.209054 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:15 crc kubenswrapper[4772]: W0127 15:25:15.225494 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda490a71b_c33d_4c94_9592_f97d1d315e81.slice/crio-b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca WatchSource:0}: Error finding container b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca: Status 404 returned error can't find the container with id b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.325756 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:15 crc kubenswrapper[4772]: W0127 15:25:15.336632 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod003a41cd_8661_4d0a_a5b7_4e06e02d3785.slice/crio-268aefd3f8b14bdf425ab8056006b626765f75c0f77a51699e073edbd88f6c5f WatchSource:0}: Error finding container 268aefd3f8b14bdf425ab8056006b626765f75c0f77a51699e073edbd88f6c5f: Status 404 returned error can't find the container with id 268aefd3f8b14bdf425ab8056006b626765f75c0f77a51699e073edbd88f6c5f Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.336789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerStarted","Data":"383ce19f4879446a46975b1e3757ca75d5dbab13e103b56af11750ee3019f6bc"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.341926 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4jb8m"] Jan 27 15:25:15 crc kubenswrapper[4772]: E0127 15:25:15.346114 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerName="init" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.346156 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerName="init" Jan 27 15:25:15 crc kubenswrapper[4772]: E0127 15:25:15.346257 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerName="dnsmasq-dns" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.346267 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerName="dnsmasq-dns" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.347631 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" 
containerName="dnsmasq-dns" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.363319 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4jb8m"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.363356 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d4llz" event={"ID":"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9","Type":"ContainerStarted","Data":"a5599751ce46331dd2a224ba692cd6619979f4eb0205e3a54352eb587e777c31"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.363485 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.366267 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.367736 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tltm6" event={"ID":"01d2ace8-4fbb-4f53-aa31-7557dbaabcce","Type":"ContainerStarted","Data":"67ecd74afacd326820a12dc1cfdc76a179790d7bc04cff09eef9f1e0a03e5d5e"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.369770 4772 generic.go:334] "Generic (PLEG): container finished" podID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerID="3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39" exitCode=0 Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.369890 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerDied","Data":"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.372845 4772 generic.go:334] "Generic (PLEG): container finished" podID="8aead2c0-bb19-4542-8736-67943c23f0c0" containerID="80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579" 
exitCode=0 Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.372945 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" event={"ID":"8aead2c0-bb19-4542-8736-67943c23f0c0","Type":"ContainerDied","Data":"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.372975 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" event={"ID":"8aead2c0-bb19-4542-8736-67943c23f0c0","Type":"ContainerDied","Data":"80889f4358a04bf7fac97ea1ad80ad55c76754b98341035087589aff0399485d"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.373021 4772 scope.go:117] "RemoveContainer" containerID="80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.373243 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-62ktv" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.378245 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" event={"ID":"477337bf-a24a-44fd-9c46-38d2e1566b18","Type":"ContainerDied","Data":"e3d43f62abf3e7b0d500727260573ad5c5ab13345a55b4ce49116d69dfd50dd4"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.378317 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rsdjb" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.389453 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqpfg" event={"ID":"a490a71b-c33d-4c94-9592-f97d1d315e81","Type":"ContainerStarted","Data":"b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca"} Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.398791 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-d4llz" podStartSLOduration=1.746414787 podStartE2EDuration="5.39876811s" podCreationTimestamp="2026-01-27 15:25:10 +0000 UTC" firstStartedPulling="2026-01-27 15:25:10.99954297 +0000 UTC m=+1096.980152068" lastFinishedPulling="2026-01-27 15:25:14.651896293 +0000 UTC m=+1100.632505391" observedRunningTime="2026-01-27 15:25:15.389581466 +0000 UTC m=+1101.370190584" watchObservedRunningTime="2026-01-27 15:25:15.39876811 +0000 UTC m=+1101.379377208" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.403774 4772 scope.go:117] "RemoveContainer" containerID="932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.426818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.426905 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flptk\" (UniqueName: \"kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " 
pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.480500 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.485313 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rsdjb"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.494415 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.500244 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-62ktv"] Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.511377 4772 scope.go:117] "RemoveContainer" containerID="80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579" Jan 27 15:25:15 crc kubenswrapper[4772]: E0127 15:25:15.512503 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579\": container with ID starting with 80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579 not found: ID does not exist" containerID="80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.512569 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579"} err="failed to get container status \"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579\": rpc error: code = NotFound desc = could not find container \"80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579\": container with ID starting with 80133493ee73cefd49c95662c6d25ef4d357fe61d09a830267d9876af454f579 not found: ID does not exist" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 
15:25:15.512602 4772 scope.go:117] "RemoveContainer" containerID="932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73" Jan 27 15:25:15 crc kubenswrapper[4772]: E0127 15:25:15.513275 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73\": container with ID starting with 932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73 not found: ID does not exist" containerID="932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.513328 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73"} err="failed to get container status \"932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73\": rpc error: code = NotFound desc = could not find container \"932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73\": container with ID starting with 932ddd9664bf5058afe13fff46657f7b87008d96d4495fb6b027b04295f1af73 not found: ID does not exist" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.528358 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.528441 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flptk\" (UniqueName: \"kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " 
pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.529680 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.555932 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flptk\" (UniqueName: \"kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk\") pod \"root-account-create-update-4jb8m\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:15 crc kubenswrapper[4772]: I0127 15:25:15.734437 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.179532 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4jb8m"] Jan 27 15:25:16 crc kubenswrapper[4772]: W0127 15:25:16.302639 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a9e15e_9947_44be_872f_20072b41a7fc.slice/crio-821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d WatchSource:0}: Error finding container 821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d: Status 404 returned error can't find the container with id 821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.399645 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4jb8m" 
event={"ID":"e0a9e15e-9947-44be-872f-20072b41a7fc","Type":"ContainerStarted","Data":"821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.401048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerStarted","Data":"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.426551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqpfg" event={"ID":"a490a71b-c33d-4c94-9592-f97d1d315e81","Type":"ContainerStarted","Data":"b93ad84c922746d427d3e2a2deb04a875a239fcafbecb5146ae05b1b11e36a09"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.430742 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerStarted","Data":"900401625caff4c2d87fe06884c7dcba7f46fdc58e9213b1a6cc2cf36d383e52"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.433330 4772 generic.go:334] "Generic (PLEG): container finished" podID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerID="a76c09aaadcee4723d7ef767396afbe7396ff3e3af040a33171b3953859d1cba" exitCode=0 Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.433390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tltm6" event={"ID":"01d2ace8-4fbb-4f53-aa31-7557dbaabcce","Type":"ContainerDied","Data":"a76c09aaadcee4723d7ef767396afbe7396ff3e3af040a33171b3953859d1cba"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.439508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerStarted","Data":"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0"} Jan 27 15:25:16 crc kubenswrapper[4772]: 
I0127 15:25:16.456364 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vqpfg" podStartSLOduration=4.456346971 podStartE2EDuration="4.456346971s" podCreationTimestamp="2026-01-27 15:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:16.444552281 +0000 UTC m=+1102.425161369" watchObservedRunningTime="2026-01-27 15:25:16.456346971 +0000 UTC m=+1102.436956069" Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.456910 4772 generic.go:334] "Generic (PLEG): container finished" podID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerID="5a47e1d72c8f5eacedc4878d5823874f52e03c455d9958e9166510df3dfd80ce" exitCode=0 Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.456972 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" event={"ID":"003a41cd-8661-4d0a-a5b7-4e06e02d3785","Type":"ContainerDied","Data":"5a47e1d72c8f5eacedc4878d5823874f52e03c455d9958e9166510df3dfd80ce"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.456998 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" event={"ID":"003a41cd-8661-4d0a-a5b7-4e06e02d3785","Type":"ContainerStarted","Data":"268aefd3f8b14bdf425ab8056006b626765f75c0f77a51699e073edbd88f6c5f"} Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.477445 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371993.377354 podStartE2EDuration="43.477422867s" podCreationTimestamp="2026-01-27 15:24:33 +0000 UTC" firstStartedPulling="2026-01-27 15:24:35.998493379 +0000 UTC m=+1061.979102477" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:16.474594455 +0000 UTC m=+1102.455203553" watchObservedRunningTime="2026-01-27 15:25:16.477422867 +0000 UTC 
m=+1102.458031965" Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.676192 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="477337bf-a24a-44fd-9c46-38d2e1566b18" path="/var/lib/kubelet/pods/477337bf-a24a-44fd-9c46-38d2e1566b18/volumes" Jan 27 15:25:16 crc kubenswrapper[4772]: I0127 15:25:16.676769 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aead2c0-bb19-4542-8736-67943c23f0c0" path="/var/lib/kubelet/pods/8aead2c0-bb19-4542-8736-67943c23f0c0/volumes" Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.469281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" event={"ID":"003a41cd-8661-4d0a-a5b7-4e06e02d3785","Type":"ContainerStarted","Data":"36cd1ff546856c238a871962177bd25e90c208878c18e039ee4f0f710207ad78"} Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.469739 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.470684 4772 generic.go:334] "Generic (PLEG): container finished" podID="e0a9e15e-9947-44be-872f-20072b41a7fc" containerID="6d6c94667c0ae61eab0c4931fc95c11f862c674ae06fd177d824e395ced6b9a6" exitCode=0 Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.470741 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4jb8m" event={"ID":"e0a9e15e-9947-44be-872f-20072b41a7fc","Type":"ContainerDied","Data":"6d6c94667c0ae61eab0c4931fc95c11f862c674ae06fd177d824e395ced6b9a6"} Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.472622 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerStarted","Data":"b1542ba131aec1cffd5520f2969b843d3aa12fe7b4cd60022addce3e73977b99"} Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.472653 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerStarted","Data":"f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a"} Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.472727 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.474520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tltm6" event={"ID":"01d2ace8-4fbb-4f53-aa31-7557dbaabcce","Type":"ContainerStarted","Data":"d1b5117c10f9331477f591f10a624b08ae6968087cc1bb15580ee055f80a719c"} Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.476204 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:17 crc kubenswrapper[4772]: E0127 15:25:17.476388 4772 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 15:25:17 crc kubenswrapper[4772]: E0127 15:25:17.476434 4772 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 15:25:17 crc kubenswrapper[4772]: E0127 15:25:17.476485 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift podName:3ef68955-b80c-4732-9e87-0bec53d0b3a0 nodeName:}" failed. No retries permitted until 2026-01-27 15:25:25.476470244 +0000 UTC m=+1111.457079342 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift") pod "swift-storage-0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0") : configmap "swift-ring-files" not found Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.490183 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" podStartSLOduration=5.490142838 podStartE2EDuration="5.490142838s" podCreationTimestamp="2026-01-27 15:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:17.487635196 +0000 UTC m=+1103.468244314" watchObservedRunningTime="2026-01-27 15:25:17.490142838 +0000 UTC m=+1103.470751936" Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.519481 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.3742372400000002 podStartE2EDuration="4.519459681s" podCreationTimestamp="2026-01-27 15:25:13 +0000 UTC" firstStartedPulling="2026-01-27 15:25:15.216403316 +0000 UTC m=+1101.197012414" lastFinishedPulling="2026-01-27 15:25:16.361625757 +0000 UTC m=+1102.342234855" observedRunningTime="2026-01-27 15:25:17.514025254 +0000 UTC m=+1103.494634352" watchObservedRunningTime="2026-01-27 15:25:17.519459681 +0000 UTC m=+1103.500068779" Jan 27 15:25:17 crc kubenswrapper[4772]: I0127 15:25:17.532067 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-tltm6" podStartSLOduration=5.532037612 podStartE2EDuration="5.532037612s" podCreationTimestamp="2026-01-27 15:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:17.530470797 +0000 UTC m=+1103.511079905" watchObservedRunningTime="2026-01-27 15:25:17.532037612 +0000 UTC 
m=+1103.512646710" Jan 27 15:25:18 crc kubenswrapper[4772]: I0127 15:25:18.201312 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:18 crc kubenswrapper[4772]: I0127 15:25:18.595363 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 27 15:25:18 crc kubenswrapper[4772]: I0127 15:25:18.826322 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.002493 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts\") pod \"e0a9e15e-9947-44be-872f-20072b41a7fc\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.002563 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flptk\" (UniqueName: \"kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk\") pod \"e0a9e15e-9947-44be-872f-20072b41a7fc\" (UID: \"e0a9e15e-9947-44be-872f-20072b41a7fc\") " Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.003339 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0a9e15e-9947-44be-872f-20072b41a7fc" (UID: "e0a9e15e-9947-44be-872f-20072b41a7fc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.022792 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk" (OuterVolumeSpecName: "kube-api-access-flptk") pod "e0a9e15e-9947-44be-872f-20072b41a7fc" (UID: "e0a9e15e-9947-44be-872f-20072b41a7fc"). InnerVolumeSpecName "kube-api-access-flptk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.104754 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flptk\" (UniqueName: \"kubernetes.io/projected/e0a9e15e-9947-44be-872f-20072b41a7fc-kube-api-access-flptk\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.104795 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0a9e15e-9947-44be-872f-20072b41a7fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.488441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4jb8m" event={"ID":"e0a9e15e-9947-44be-872f-20072b41a7fc","Type":"ContainerDied","Data":"821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d"} Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.488499 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="821dcdb092c7d4fb686ec29c55b5cbeddc94342c181cf08bb0e05a5c49e2d37d" Jan 27 15:25:19 crc kubenswrapper[4772]: I0127 15:25:19.488458 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4jb8m" Jan 27 15:25:22 crc kubenswrapper[4772]: I0127 15:25:22.961355 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:23 crc kubenswrapper[4772]: I0127 15:25:23.203057 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:23 crc kubenswrapper[4772]: I0127 15:25:23.251906 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:23 crc kubenswrapper[4772]: I0127 15:25:23.513795 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="dnsmasq-dns" containerID="cri-o://36cd1ff546856c238a871962177bd25e90c208878c18e039ee4f0f710207ad78" gracePeriod=10 Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.529839 4772 generic.go:334] "Generic (PLEG): container finished" podID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerID="36cd1ff546856c238a871962177bd25e90c208878c18e039ee4f0f710207ad78" exitCode=0 Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.529933 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" event={"ID":"003a41cd-8661-4d0a-a5b7-4e06e02d3785","Type":"ContainerDied","Data":"36cd1ff546856c238a871962177bd25e90c208878c18e039ee4f0f710207ad78"} Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.833978 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.913692 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22gz\" (UniqueName: \"kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz\") pod \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.913753 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc\") pod \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.913836 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb\") pod \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.913952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config\") pod \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\" (UID: \"003a41cd-8661-4d0a-a5b7-4e06e02d3785\") " Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.929737 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz" (OuterVolumeSpecName: "kube-api-access-t22gz") pod "003a41cd-8661-4d0a-a5b7-4e06e02d3785" (UID: "003a41cd-8661-4d0a-a5b7-4e06e02d3785"). InnerVolumeSpecName "kube-api-access-t22gz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.962010 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "003a41cd-8661-4d0a-a5b7-4e06e02d3785" (UID: "003a41cd-8661-4d0a-a5b7-4e06e02d3785"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.962561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "003a41cd-8661-4d0a-a5b7-4e06e02d3785" (UID: "003a41cd-8661-4d0a-a5b7-4e06e02d3785"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:24 crc kubenswrapper[4772]: I0127 15:25:24.980875 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config" (OuterVolumeSpecName: "config") pod "003a41cd-8661-4d0a-a5b7-4e06e02d3785" (UID: "003a41cd-8661-4d0a-a5b7-4e06e02d3785"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.016070 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.016106 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22gz\" (UniqueName: \"kubernetes.io/projected/003a41cd-8661-4d0a-a5b7-4e06e02d3785-kube-api-access-t22gz\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.016123 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.016134 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/003a41cd-8661-4d0a-a5b7-4e06e02d3785-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.119678 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.119748 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.524579 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.530345 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"swift-storage-0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " pod="openstack/swift-storage-0" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.540184 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" event={"ID":"003a41cd-8661-4d0a-a5b7-4e06e02d3785","Type":"ContainerDied","Data":"268aefd3f8b14bdf425ab8056006b626765f75c0f77a51699e073edbd88f6c5f"} Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.540471 4772 scope.go:117] "RemoveContainer" containerID="36cd1ff546856c238a871962177bd25e90c208878c18e039ee4f0f710207ad78" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.540294 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c89d5d749-vrz8v" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.559867 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.574612 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.577202 4772 scope.go:117] "RemoveContainer" containerID="5a47e1d72c8f5eacedc4878d5823874f52e03c455d9958e9166510df3dfd80ce" Jan 27 15:25:25 crc kubenswrapper[4772]: I0127 15:25:25.581335 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c89d5d749-vrz8v"] Jan 27 15:25:26 crc kubenswrapper[4772]: I0127 15:25:26.097049 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:25:26 crc kubenswrapper[4772]: I0127 15:25:26.553114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"7e2686f92b31392fd2420828f9959abe37458794a1d13beae3bf48377776f704"} Jan 
27 15:25:26 crc kubenswrapper[4772]: I0127 15:25:26.554383 4772 generic.go:334] "Generic (PLEG): container finished" podID="2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" containerID="a5599751ce46331dd2a224ba692cd6619979f4eb0205e3a54352eb587e777c31" exitCode=0 Jan 27 15:25:26 crc kubenswrapper[4772]: I0127 15:25:26.554426 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d4llz" event={"ID":"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9","Type":"ContainerDied","Data":"a5599751ce46331dd2a224ba692cd6619979f4eb0205e3a54352eb587e777c31"} Jan 27 15:25:26 crc kubenswrapper[4772]: I0127 15:25:26.672560 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" path="/var/lib/kubelet/pods/003a41cd-8661-4d0a-a5b7-4e06e02d3785/volumes" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.005629 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.066926 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvbq9\" (UniqueName: \"kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.066981 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.070317 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle\") pod 
\"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.070374 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.070406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.070459 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.070666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift\") pod \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\" (UID: \"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9\") " Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.071064 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9" (OuterVolumeSpecName: "kube-api-access-mvbq9") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "kube-api-access-mvbq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.071484 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvbq9\" (UniqueName: \"kubernetes.io/projected/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-kube-api-access-mvbq9\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.072039 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.072300 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.077569 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.091233 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.093315 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts" (OuterVolumeSpecName: "scripts") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.096160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" (UID: "2313c291-4eb5-4b79-ad9b-b04cd06a1ee9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.172939 4772 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.172972 4772 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.172982 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.172990 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 
27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.173000 4772 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.173008 4772 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.570725 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"c3f602f5b8fe5f978c40989adc1d0130c6aaae0dce0fc13d5e34bbe819e8eccb"} Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.571055 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"5f271cd2dcb6b658cde722402c5b2945c28f4d7486cab8c56e064081779416a1"} Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.571065 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"d35aa807e61d39133b8319305719556fcfa6889495c80253864eaf2dc48a450b"} Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.572555 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-d4llz" event={"ID":"2313c291-4eb5-4b79-ad9b-b04cd06a1ee9","Type":"ContainerDied","Data":"1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39"} Jan 27 15:25:28 crc kubenswrapper[4772]: I0127 15:25:28.572609 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fe7af5e79ef4da92cef0b65e4f6ca7d519d839fcd9321afbd7bf43485be0f39" Jan 27 15:25:28 crc 
kubenswrapper[4772]: I0127 15:25:28.572638 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-d4llz" Jan 27 15:25:29 crc kubenswrapper[4772]: I0127 15:25:29.043010 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 27 15:25:29 crc kubenswrapper[4772]: I0127 15:25:29.135667 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 27 15:25:29 crc kubenswrapper[4772]: I0127 15:25:29.582320 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"ac32767b3784713a66fbfe32a337398a7461aa8ffad58bbfea7ccf6e3c4ee19d"} Jan 27 15:25:30 crc kubenswrapper[4772]: I0127 15:25:30.594359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"99c9f47c0720632dfecbfc5e9152885ab96d751677b561767c79f0a032ca5cf5"} Jan 27 15:25:30 crc kubenswrapper[4772]: I0127 15:25:30.595769 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"0c6f6ecf89a4947c23560538762ca73dfe5e13c4acb04e206d91772a3cfc9c49"} Jan 27 15:25:30 crc kubenswrapper[4772]: I0127 15:25:30.595788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"94e4c588a745acb16ce919a52f7150cf54119c1c41e94c9e658206e6b58958ed"} Jan 27 15:25:30 crc kubenswrapper[4772]: I0127 15:25:30.595815 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"494d3ebaeddb756bf375d2bc394a4b4086ee3e25d9a76747552d41c1f40a9737"} Jan 27 15:25:32 crc kubenswrapper[4772]: I0127 15:25:32.617208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"b0a7c137687a720a7d8c3f84cc586f4b9d3bde7c9bc9e2e0c83a325c2ae23322"} Jan 27 15:25:32 crc kubenswrapper[4772]: I0127 15:25:32.617779 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"8bbb31c1be222187b0e9b27f07c1ac0fe66d8ad583df4ff6b26fec62ab98cf87"} Jan 27 15:25:32 crc kubenswrapper[4772]: I0127 15:25:32.617793 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"71b4242b9081be055bfb8bd2db6959d32259cd0c3ee2b95ddde1c1d2154be74b"} Jan 27 15:25:32 crc kubenswrapper[4772]: I0127 15:25:32.617802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"bc57f117c387fb10832190ea21f63cdb319308d9390292395fb515e28966d217"} Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.634729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"0b50101071feccad5793667a8f4849d22482c6d522fac228c249d69d6d557cdf"} Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.634779 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"8d889567d10b3e8868d76680ff442da2a14216919aae766c356918ec9960b9a4"} Jan 27 15:25:33 crc 
kubenswrapper[4772]: I0127 15:25:33.634792 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerStarted","Data":"c1cf3012e8501ba3a809e028a1ab49c960d95fb090a04b4dbca6cd01d2de9524"} Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.677558 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.308887431 podStartE2EDuration="25.67753161s" podCreationTimestamp="2026-01-27 15:25:08 +0000 UTC" firstStartedPulling="2026-01-27 15:25:26.106914043 +0000 UTC m=+1112.087523161" lastFinishedPulling="2026-01-27 15:25:31.475558242 +0000 UTC m=+1117.456167340" observedRunningTime="2026-01-27 15:25:33.669697814 +0000 UTC m=+1119.650306922" watchObservedRunningTime="2026-01-27 15:25:33.67753161 +0000 UTC m=+1119.658140698" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.745705 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4jb8m"] Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.753886 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4jb8m"] Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825311 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-547fc"] Jan 27 15:25:33 crc kubenswrapper[4772]: E0127 15:25:33.825658 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="init" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825674 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="init" Jan 27 15:25:33 crc kubenswrapper[4772]: E0127 15:25:33.825694 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="dnsmasq-dns" Jan 27 15:25:33 crc 
kubenswrapper[4772]: I0127 15:25:33.825701 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="dnsmasq-dns" Jan 27 15:25:33 crc kubenswrapper[4772]: E0127 15:25:33.825710 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a9e15e-9947-44be-872f-20072b41a7fc" containerName="mariadb-account-create-update" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825716 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a9e15e-9947-44be-872f-20072b41a7fc" containerName="mariadb-account-create-update" Jan 27 15:25:33 crc kubenswrapper[4772]: E0127 15:25:33.825726 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" containerName="swift-ring-rebalance" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825732 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" containerName="swift-ring-rebalance" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825885 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" containerName="swift-ring-rebalance" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825897 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a9e15e-9947-44be-872f-20072b41a7fc" containerName="mariadb-account-create-update" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.825906 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="003a41cd-8661-4d0a-a5b7-4e06e02d3785" containerName="dnsmasq-dns" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.826427 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-547fc" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.831295 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.840766 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-547fc"] Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.915442 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.972775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q5jf\" (UniqueName: \"kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " pod="openstack/root-account-create-update-547fc" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.972868 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " pod="openstack/root-account-create-update-547fc" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.993774 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.995433 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:33 crc kubenswrapper[4772]: I0127 15:25:33.998027 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.012762 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.074027 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " pod="openstack/root-account-create-update-547fc" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.074238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q5jf\" (UniqueName: \"kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " pod="openstack/root-account-create-update-547fc" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.074954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " pod="openstack/root-account-create-update-547fc" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.096990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q5jf\" (UniqueName: \"kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf\") pod \"root-account-create-update-547fc\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") " 
pod="openstack/root-account-create-update-547fc" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.140983 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-547fc" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.175861 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.175984 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.176041 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.176060 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.176134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.176187 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtsjh\" (UniqueName: \"kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.277537 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.277997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.278040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.278124 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.278220 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtsjh\" (UniqueName: \"kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.278307 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.278950 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.283845 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.284042 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.285638 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.285663 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.298253 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtsjh\" (UniqueName: \"kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh\") pod \"dnsmasq-dns-77585f5f8c-lll86\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.324124 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:34 crc kubenswrapper[4772]: W0127 15:25:34.558871 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1eaad6_cd29_4189_8ecd_62b7658e69ef.slice/crio-ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12 WatchSource:0}: Error finding container ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12: Status 404 returned error can't find the container with id ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12 Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.562458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-547fc"] Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.647672 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-547fc" event={"ID":"dd1eaad6-cd29-4189-8ecd-62b7658e69ef","Type":"ContainerStarted","Data":"ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12"} Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.677979 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a9e15e-9947-44be-872f-20072b41a7fc" path="/var/lib/kubelet/pods/e0a9e15e-9947-44be-872f-20072b41a7fc/volumes" Jan 27 15:25:34 crc kubenswrapper[4772]: I0127 15:25:34.782993 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] Jan 27 15:25:34 crc kubenswrapper[4772]: W0127 15:25:34.783338 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6aa637d_4418_4fa4_8a26_249446d2fb3f.slice/crio-353300bf1914ec8c1fafaa4dfe7633842f95697653e6f9ec7954d70422c9cfbd WatchSource:0}: Error finding container 353300bf1914ec8c1fafaa4dfe7633842f95697653e6f9ec7954d70422c9cfbd: Status 404 returned error can't find the container with id 
353300bf1914ec8c1fafaa4dfe7633842f95697653e6f9ec7954d70422c9cfbd Jan 27 15:25:35 crc kubenswrapper[4772]: I0127 15:25:35.656140 4772 generic.go:334] "Generic (PLEG): container finished" podID="dd1eaad6-cd29-4189-8ecd-62b7658e69ef" containerID="7ce8beebc480cca9e2ff0700b901cda6f6e2d53f77d8edbfd7e337a2359ae80a" exitCode=0 Jan 27 15:25:35 crc kubenswrapper[4772]: I0127 15:25:35.656335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-547fc" event={"ID":"dd1eaad6-cd29-4189-8ecd-62b7658e69ef","Type":"ContainerDied","Data":"7ce8beebc480cca9e2ff0700b901cda6f6e2d53f77d8edbfd7e337a2359ae80a"} Jan 27 15:25:35 crc kubenswrapper[4772]: I0127 15:25:35.658114 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerID="26a0610819b472b19e1babe3f9b5893ac7bd92b0c9047d536f0dadb42db99a12" exitCode=0 Jan 27 15:25:35 crc kubenswrapper[4772]: I0127 15:25:35.658147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" event={"ID":"b6aa637d-4418-4fa4-8a26-249446d2fb3f","Type":"ContainerDied","Data":"26a0610819b472b19e1babe3f9b5893ac7bd92b0c9047d536f0dadb42db99a12"} Jan 27 15:25:35 crc kubenswrapper[4772]: I0127 15:25:35.658214 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" event={"ID":"b6aa637d-4418-4fa4-8a26-249446d2fb3f","Type":"ContainerStarted","Data":"353300bf1914ec8c1fafaa4dfe7633842f95697653e6f9ec7954d70422c9cfbd"} Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.473715 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-nmvpf"] Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.475281 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.486601 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-nmvpf"] Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.589475 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4c0e-account-create-update-w9dkg"] Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.591746 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-w9dkg" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.594834 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.610065 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4c0e-account-create-update-w9dkg"] Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.622711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zzs9\" (UniqueName: \"kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.622933 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.674302 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" 
event={"ID":"b6aa637d-4418-4fa4-8a26-249446d2fb3f","Type":"ContainerStarted","Data":"ded8f7e741d736bdfe8cef79d54407ecbfa8926bb6d56e27836f39ea6ec4c8ef"} Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.690300 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" podStartSLOduration=3.690277525 podStartE2EDuration="3.690277525s" podCreationTimestamp="2026-01-27 15:25:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:36.685607421 +0000 UTC m=+1122.666216539" watchObservedRunningTime="2026-01-27 15:25:36.690277525 +0000 UTC m=+1122.670886633" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.724023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzq4h\" (UniqueName: \"kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.724096 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zzs9\" (UniqueName: \"kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.724357 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 
15:25:36.724466 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.725150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.741660 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zzs9\" (UniqueName: \"kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9\") pod \"keystone-db-create-nmvpf\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " pod="openstack/keystone-db-create-nmvpf"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.795312 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cg94r"]
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.796862 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.804797 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cg94r"]
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.826291 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.826626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzq4h\" (UniqueName: \"kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.827259 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.836773 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nmvpf"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.846139 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzq4h\" (UniqueName: \"kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h\") pod \"keystone-4c0e-account-create-update-w9dkg\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.908974 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e8b1-account-create-update-8rlww"]
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.910551 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-w9dkg"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.911237 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.913808 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.916279 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e8b1-account-create-update-8rlww"]
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.927864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:36 crc kubenswrapper[4772]: I0127 15:25:36.928016 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2fg6\" (UniqueName: \"kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.029447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pssdb\" (UniqueName: \"kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.029838 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2fg6\" (UniqueName: \"kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.029935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.030078 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.030956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.047660 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-547fc"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.051865 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2fg6\" (UniqueName: \"kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6\") pod \"placement-db-create-cg94r\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.102719 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dpr42"]
Jan 27 15:25:37 crc kubenswrapper[4772]: E0127 15:25:37.103126 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1eaad6-cd29-4189-8ecd-62b7658e69ef" containerName="mariadb-account-create-update"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.103151 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1eaad6-cd29-4189-8ecd-62b7658e69ef" containerName="mariadb-account-create-update"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.103386 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1eaad6-cd29-4189-8ecd-62b7658e69ef" containerName="mariadb-account-create-update"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.104036 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.117892 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dpr42"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.124783 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cg94r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.133184 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.133269 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pssdb\" (UniqueName: \"kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.134022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.152608 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pssdb\" (UniqueName: \"kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb\") pod \"placement-e8b1-account-create-update-8rlww\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.159954 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.178153 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-gxjzh" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" probeResult="failure" output=<
Jan 27 15:25:37 crc kubenswrapper[4772]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 27 15:25:37 crc kubenswrapper[4772]: >
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.206742 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a648-account-create-update-qhx8z"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.208967 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.211603 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.213960 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cqx7r"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.229858 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a648-account-create-update-qhx8z"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.236864 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q5jf\" (UniqueName: \"kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf\") pod \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") "
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.237041 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts\") pod \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\" (UID: \"dd1eaad6-cd29-4189-8ecd-62b7658e69ef\") "
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.237550 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww95\" (UniqueName: \"kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.237591 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.238071 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd1eaad6-cd29-4189-8ecd-62b7658e69ef" (UID: "dd1eaad6-cd29-4189-8ecd-62b7658e69ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.246101 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf" (OuterVolumeSpecName: "kube-api-access-4q5jf") pod "dd1eaad6-cd29-4189-8ecd-62b7658e69ef" (UID: "dd1eaad6-cd29-4189-8ecd-62b7658e69ef"). InnerVolumeSpecName "kube-api-access-4q5jf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345301 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmjw4\" (UniqueName: \"kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345726 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww95\" (UniqueName: \"kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345843 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345973 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q5jf\" (UniqueName: \"kubernetes.io/projected/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-kube-api-access-4q5jf\") on node \"crc\" DevicePath \"\""
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.345992 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1eaad6-cd29-4189-8ecd-62b7658e69ef-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.346826 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.361931 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-nmvpf"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.365976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww95\" (UniqueName: \"kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95\") pod \"glance-db-create-dpr42\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.366647 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e8b1-account-create-update-8rlww"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.430458 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dpr42"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.448139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.448384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmjw4\" (UniqueName: \"kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.449587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.453064 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gxjzh-config-c4mg2"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.454117 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.461625 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.467916 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh-config-c4mg2"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.495032 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmjw4\" (UniqueName: \"kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4\") pod \"glance-a648-account-create-update-qhx8z\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.537445 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4c0e-account-create-update-w9dkg"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.550901 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a648-account-create-update-qhx8z"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.652872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.652920 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pd4n\" (UniqueName: \"kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.652990 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.653064 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.653094 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.653110 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.655646 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cg94r"]
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.692382 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-547fc"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.692377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-547fc" event={"ID":"dd1eaad6-cd29-4189-8ecd-62b7658e69ef","Type":"ContainerDied","Data":"ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12"}
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.694354 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7e59e7bcd568f0fef4ca50eaae4a57d525926c5e0abdd0b3a8025ace15dd12"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.699866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cg94r" event={"ID":"2b68551f-119d-4d84-9c91-20e013018b7a","Type":"ContainerStarted","Data":"69a5b31d0e44865250d8bac44f85dd2b9adb78c9d00444c5fa3b2797795ff8b2"}
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.702271 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4c0e-account-create-update-w9dkg" event={"ID":"ef900211-2a44-498c-adb6-fec1abcba5ec","Type":"ContainerStarted","Data":"04dbba6fb62087c40aa12d14c3493253436c9c1687fdba32ecfb0a887e347e49"}
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.705720 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nmvpf" event={"ID":"bbad3a30-e11d-4ae8-9c42-e06b6382c6de","Type":"ContainerStarted","Data":"31d9e486da5aa706768e022e398d969ef41f15c9db5b579c83d50ae160db05a7"}
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.705787 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nmvpf" event={"ID":"bbad3a30-e11d-4ae8-9c42-e06b6382c6de","Type":"ContainerStarted","Data":"8df1e3e2195f10d57dae3b8ddab8a9f5430c67dd9cc313305d8caf2dd324bd9e"}
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.706939 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-lll86"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.726455 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-nmvpf" podStartSLOduration=1.726386395 podStartE2EDuration="1.726386395s" podCreationTimestamp="2026-01-27 15:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:37.72273344 +0000 UTC m=+1123.703342538" watchObservedRunningTime="2026-01-27 15:25:37.726386395 +0000 UTC m=+1123.706995493"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755067 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755124 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755249 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.755282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pd4n\" (UniqueName: \"kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.756114 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.756180 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.756220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.758217 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.758711 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.777635 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pd4n\" (UniqueName: \"kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n\") pod \"ovn-controller-gxjzh-config-c4mg2\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.781720 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-c4mg2"
Jan 27 15:25:37 crc kubenswrapper[4772]: I0127 15:25:37.954051 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e8b1-account-create-update-8rlww"]
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.061067 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dpr42"]
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.124039 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a648-account-create-update-qhx8z"]
Jan 27 15:25:38 crc kubenswrapper[4772]: W0127 15:25:38.138692 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod752279e5_88ff_469d_a4db_2942659c7e24.slice/crio-71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419 WatchSource:0}: Error finding container 71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419: Status 404 returned error can't find the container with id 71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.276610 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh-config-c4mg2"]
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.718026 4772 generic.go:334] "Generic (PLEG): container finished" podID="bbad3a30-e11d-4ae8-9c42-e06b6382c6de" containerID="31d9e486da5aa706768e022e398d969ef41f15c9db5b579c83d50ae160db05a7" exitCode=0
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.718104 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nmvpf" event={"ID":"bbad3a30-e11d-4ae8-9c42-e06b6382c6de","Type":"ContainerDied","Data":"31d9e486da5aa706768e022e398d969ef41f15c9db5b579c83d50ae160db05a7"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.719850 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cg94r" event={"ID":"2b68551f-119d-4d84-9c91-20e013018b7a","Type":"ContainerStarted","Data":"6aa60721dd7c09b05e3a663482308f5a6da188370cc19651da9a73a40e00696f"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.722076 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-c4mg2" event={"ID":"e7333f8a-0a54-4dec-8e7a-c7a648d2a841","Type":"ContainerStarted","Data":"d22be9ecfb9cc0389dd0f2e64dbdb2f980e40813484563a0a652ad657fd8f5b7"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.722136 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-c4mg2" event={"ID":"e7333f8a-0a54-4dec-8e7a-c7a648d2a841","Type":"ContainerStarted","Data":"ea92769dda2095af7844e2ac2e1dec699541d4477733f6b0a3693467b5238a7b"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.723897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e8b1-account-create-update-8rlww" event={"ID":"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846","Type":"ContainerStarted","Data":"e068687fbbe8ba2bc884327a323113a2f9b397134b3783cc71145217f0aced72"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.723949 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e8b1-account-create-update-8rlww" event={"ID":"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846","Type":"ContainerStarted","Data":"b7b0b5ce95f1f8f79fe79dcceda26c46edd68768fdca8bfa61a8c002836d8e7e"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.725721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a648-account-create-update-qhx8z" event={"ID":"752279e5-88ff-469d-a4db-2942659c7e24","Type":"ContainerStarted","Data":"fcb62876ceaa2921dde5172c985a61fd201c04281f9d06dbb383e8128d91c935"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.725760 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a648-account-create-update-qhx8z" event={"ID":"752279e5-88ff-469d-a4db-2942659c7e24","Type":"ContainerStarted","Data":"71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.728666 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dpr42" event={"ID":"af586fb2-38ff-4e17-86bc-a7793cb3ac45","Type":"ContainerStarted","Data":"d569280ad66a5087c9e0aa7b8abe04a7d97361bee2ca7b7c30646e77734ba51d"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.728738 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dpr42" event={"ID":"af586fb2-38ff-4e17-86bc-a7793cb3ac45","Type":"ContainerStarted","Data":"44aa414f58e36e4ce3bdd0cdcc25ef0840eb5c07319ac311b07bd27876572c79"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.730493 4772 generic.go:334] "Generic (PLEG): container finished" podID="ef900211-2a44-498c-adb6-fec1abcba5ec" containerID="37cb21cfa353006443b3a1e31571db32c636cbf5e0c7a880cb766a2b91769826" exitCode=0
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.730551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4c0e-account-create-update-w9dkg" event={"ID":"ef900211-2a44-498c-adb6-fec1abcba5ec","Type":"ContainerDied","Data":"37cb21cfa353006443b3a1e31571db32c636cbf5e0c7a880cb766a2b91769826"}
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.765853 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-cg94r" podStartSLOduration=2.765830132 podStartE2EDuration="2.765830132s" podCreationTimestamp="2026-01-27 15:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:38.75848066 +0000 UTC m=+1124.739089768" watchObservedRunningTime="2026-01-27 15:25:38.765830132 +0000 UTC m=+1124.746439230"
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.778687 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-dpr42" podStartSLOduration=1.778661981 podStartE2EDuration="1.778661981s" podCreationTimestamp="2026-01-27 15:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:38.774770559 +0000 UTC m=+1124.755379657" watchObservedRunningTime="2026-01-27 15:25:38.778661981 +0000 UTC m=+1124.759271079"
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.813639 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-e8b1-account-create-update-8rlww" podStartSLOduration=2.813615948 podStartE2EDuration="2.813615948s" podCreationTimestamp="2026-01-27 15:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:38.810433447 +0000 UTC m=+1124.791042555" watchObservedRunningTime="2026-01-27 15:25:38.813615948 +0000 UTC m=+1124.794225046"
Jan 27 15:25:38 crc kubenswrapper[4772]: I0127 15:25:38.817399 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-a648-account-create-update-qhx8z" podStartSLOduration=1.8173877470000002 podStartE2EDuration="1.817387747s" podCreationTimestamp="2026-01-27 15:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:38.796878626 +0000 UTC m=+1124.777487724" watchObservedRunningTime="2026-01-27 15:25:38.817387747 +0000 UTC m=+1124.797996845"
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.740458 4772 generic.go:334] "Generic (PLEG): container finished" podID="af586fb2-38ff-4e17-86bc-a7793cb3ac45" containerID="d569280ad66a5087c9e0aa7b8abe04a7d97361bee2ca7b7c30646e77734ba51d" exitCode=0
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.740532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dpr42" event={"ID":"af586fb2-38ff-4e17-86bc-a7793cb3ac45","Type":"ContainerDied","Data":"d569280ad66a5087c9e0aa7b8abe04a7d97361bee2ca7b7c30646e77734ba51d"}
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.743459 4772 generic.go:334] "Generic (PLEG): container finished" podID="2b68551f-119d-4d84-9c91-20e013018b7a" containerID="6aa60721dd7c09b05e3a663482308f5a6da188370cc19651da9a73a40e00696f" exitCode=0
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.743508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cg94r" event={"ID":"2b68551f-119d-4d84-9c91-20e013018b7a","Type":"ContainerDied","Data":"6aa60721dd7c09b05e3a663482308f5a6da188370cc19651da9a73a40e00696f"}
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.745072 4772 generic.go:334] "Generic (PLEG): container finished" podID="e7333f8a-0a54-4dec-8e7a-c7a648d2a841" containerID="d22be9ecfb9cc0389dd0f2e64dbdb2f980e40813484563a0a652ad657fd8f5b7" exitCode=0
Jan 27 15:25:39 crc kubenswrapper[4772]: I0127 15:25:39.745248 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-c4mg2" event={"ID":"e7333f8a-0a54-4dec-8e7a-c7a648d2a841","Type":"ContainerDied","Data":"d22be9ecfb9cc0389dd0f2e64dbdb2f980e40813484563a0a652ad657fd8f5b7"}
Jan 27 15:25:39 crc kubenswrapper[4772]: E0127 15:25:39.782984 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures:
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf586fb2_38ff_4e17_86bc_a7793cb3ac45.slice/crio-d569280ad66a5087c9e0aa7b8abe04a7d97361bee2ca7b7c30646e77734ba51d.scope\": RecentStats: unable to find data in memory cache]" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.332255 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-w9dkg" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.339274 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.515290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zzs9\" (UniqueName: \"kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9\") pod \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.515505 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzq4h\" (UniqueName: \"kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h\") pod \"ef900211-2a44-498c-adb6-fec1abcba5ec\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.515607 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts\") pod \"ef900211-2a44-498c-adb6-fec1abcba5ec\" (UID: \"ef900211-2a44-498c-adb6-fec1abcba5ec\") " Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.515645 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts\") pod 
\"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\" (UID: \"bbad3a30-e11d-4ae8-9c42-e06b6382c6de\") " Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.516473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbad3a30-e11d-4ae8-9c42-e06b6382c6de" (UID: "bbad3a30-e11d-4ae8-9c42-e06b6382c6de"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.516560 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef900211-2a44-498c-adb6-fec1abcba5ec" (UID: "ef900211-2a44-498c-adb6-fec1abcba5ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.521695 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9" (OuterVolumeSpecName: "kube-api-access-5zzs9") pod "bbad3a30-e11d-4ae8-9c42-e06b6382c6de" (UID: "bbad3a30-e11d-4ae8-9c42-e06b6382c6de"). InnerVolumeSpecName "kube-api-access-5zzs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.522338 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h" (OuterVolumeSpecName: "kube-api-access-kzq4h") pod "ef900211-2a44-498c-adb6-fec1abcba5ec" (UID: "ef900211-2a44-498c-adb6-fec1abcba5ec"). InnerVolumeSpecName "kube-api-access-kzq4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.618339 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzq4h\" (UniqueName: \"kubernetes.io/projected/ef900211-2a44-498c-adb6-fec1abcba5ec-kube-api-access-kzq4h\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.618380 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef900211-2a44-498c-adb6-fec1abcba5ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.618398 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.618412 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zzs9\" (UniqueName: \"kubernetes.io/projected/bbad3a30-e11d-4ae8-9c42-e06b6382c6de-kube-api-access-5zzs9\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.759123 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4c0e-account-create-update-w9dkg" event={"ID":"ef900211-2a44-498c-adb6-fec1abcba5ec","Type":"ContainerDied","Data":"04dbba6fb62087c40aa12d14c3493253436c9c1687fdba32ecfb0a887e347e49"} Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.759194 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04dbba6fb62087c40aa12d14c3493253436c9c1687fdba32ecfb0a887e347e49" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.760254 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-w9dkg" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.761420 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-nmvpf" event={"ID":"bbad3a30-e11d-4ae8-9c42-e06b6382c6de","Type":"ContainerDied","Data":"8df1e3e2195f10d57dae3b8ddab8a9f5430c67dd9cc313305d8caf2dd324bd9e"} Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.761429 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-nmvpf" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.761441 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df1e3e2195f10d57dae3b8ddab8a9f5430c67dd9cc313305d8caf2dd324bd9e" Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.762856 4772 generic.go:334] "Generic (PLEG): container finished" podID="9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" containerID="e068687fbbe8ba2bc884327a323113a2f9b397134b3783cc71145217f0aced72" exitCode=0 Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.762945 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e8b1-account-create-update-8rlww" event={"ID":"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846","Type":"ContainerDied","Data":"e068687fbbe8ba2bc884327a323113a2f9b397134b3783cc71145217f0aced72"} Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.766127 4772 generic.go:334] "Generic (PLEG): container finished" podID="752279e5-88ff-469d-a4db-2942659c7e24" containerID="fcb62876ceaa2921dde5172c985a61fd201c04281f9d06dbb383e8128d91c935" exitCode=0 Jan 27 15:25:40 crc kubenswrapper[4772]: I0127 15:25:40.766257 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a648-account-create-update-qhx8z" event={"ID":"752279e5-88ff-469d-a4db-2942659c7e24","Type":"ContainerDied","Data":"fcb62876ceaa2921dde5172c985a61fd201c04281f9d06dbb383e8128d91c935"} Jan 27 15:25:41 crc kubenswrapper[4772]: 
I0127 15:25:41.040124 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dpr42" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.232373 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww95\" (UniqueName: \"kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95\") pod \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.232723 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts\") pod \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\" (UID: \"af586fb2-38ff-4e17-86bc-a7793cb3ac45\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.233290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af586fb2-38ff-4e17-86bc-a7793cb3ac45" (UID: "af586fb2-38ff-4e17-86bc-a7793cb3ac45"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.236661 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95" (OuterVolumeSpecName: "kube-api-access-6ww95") pod "af586fb2-38ff-4e17-86bc-a7793cb3ac45" (UID: "af586fb2-38ff-4e17-86bc-a7793cb3ac45"). InnerVolumeSpecName "kube-api-access-6ww95". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.283663 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cg94r" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.288713 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-c4mg2" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.334297 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af586fb2-38ff-4e17-86bc-a7793cb3ac45-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.334340 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ww95\" (UniqueName: \"kubernetes.io/projected/af586fb2-38ff-4e17-86bc-a7793cb3ac45-kube-api-access-6ww95\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435631 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435712 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pd4n\" (UniqueName: \"kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435729 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435755 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-s2fg6\" (UniqueName: \"kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6\") pod \"2b68551f-119d-4d84-9c91-20e013018b7a\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435813 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435892 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435915 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn\") pod \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\" (UID: \"e7333f8a-0a54-4dec-8e7a-c7a648d2a841\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435947 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts\") pod \"2b68551f-119d-4d84-9c91-20e013018b7a\" (UID: \"2b68551f-119d-4d84-9c91-20e013018b7a\") " Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435944 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.435975 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run" (OuterVolumeSpecName: "var-run") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436077 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b68551f-119d-4d84-9c91-20e013018b7a" (UID: "2b68551f-119d-4d84-9c91-20e013018b7a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436609 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436627 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436635 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436644 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b68551f-119d-4d84-9c91-20e013018b7a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.436796 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.437085 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts" (OuterVolumeSpecName: "scripts") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.444414 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n" (OuterVolumeSpecName: "kube-api-access-2pd4n") pod "e7333f8a-0a54-4dec-8e7a-c7a648d2a841" (UID: "e7333f8a-0a54-4dec-8e7a-c7a648d2a841"). InnerVolumeSpecName "kube-api-access-2pd4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.444563 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6" (OuterVolumeSpecName: "kube-api-access-s2fg6") pod "2b68551f-119d-4d84-9c91-20e013018b7a" (UID: "2b68551f-119d-4d84-9c91-20e013018b7a"). InnerVolumeSpecName "kube-api-access-s2fg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.538554 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2fg6\" (UniqueName: \"kubernetes.io/projected/2b68551f-119d-4d84-9c91-20e013018b7a-kube-api-access-s2fg6\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.538598 4772 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.538616 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.538628 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pd4n\" (UniqueName: 
\"kubernetes.io/projected/e7333f8a-0a54-4dec-8e7a-c7a648d2a841-kube-api-access-2pd4n\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.774733 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dpr42" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.774726 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dpr42" event={"ID":"af586fb2-38ff-4e17-86bc-a7793cb3ac45","Type":"ContainerDied","Data":"44aa414f58e36e4ce3bdd0cdcc25ef0840eb5c07319ac311b07bd27876572c79"} Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.774842 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44aa414f58e36e4ce3bdd0cdcc25ef0840eb5c07319ac311b07bd27876572c79" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.776214 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cg94r" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.776319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cg94r" event={"ID":"2b68551f-119d-4d84-9c91-20e013018b7a","Type":"ContainerDied","Data":"69a5b31d0e44865250d8bac44f85dd2b9adb78c9d00444c5fa3b2797795ff8b2"} Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.776391 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a5b31d0e44865250d8bac44f85dd2b9adb78c9d00444c5fa3b2797795ff8b2" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.777693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-c4mg2" event={"ID":"e7333f8a-0a54-4dec-8e7a-c7a648d2a841","Type":"ContainerDied","Data":"ea92769dda2095af7844e2ac2e1dec699541d4477733f6b0a3693467b5238a7b"} Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.777737 4772 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ea92769dda2095af7844e2ac2e1dec699541d4477733f6b0a3693467b5238a7b" Jan 27 15:25:41 crc kubenswrapper[4772]: I0127 15:25:41.777844 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-c4mg2" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.054658 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a648-account-create-update-qhx8z" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.058139 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.058197 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.127811 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-gxjzh" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.147340 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e8b1-account-create-update-8rlww" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.257298 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmjw4\" (UniqueName: \"kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4\") pod \"752279e5-88ff-469d-a4db-2942659c7e24\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.257368 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts\") pod \"752279e5-88ff-469d-a4db-2942659c7e24\" (UID: \"752279e5-88ff-469d-a4db-2942659c7e24\") " Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.257561 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts\") pod \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.257681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pssdb\" (UniqueName: \"kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb\") pod \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\" (UID: \"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846\") " Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.258071 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "752279e5-88ff-469d-a4db-2942659c7e24" (UID: "752279e5-88ff-469d-a4db-2942659c7e24"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.258101 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" (UID: "9cbda9cc-3ec5-4193-a7fb-ff06bdd20846"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.258437 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/752279e5-88ff-469d-a4db-2942659c7e24-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.258459 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.262108 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4" (OuterVolumeSpecName: "kube-api-access-vmjw4") pod "752279e5-88ff-469d-a4db-2942659c7e24" (UID: "752279e5-88ff-469d-a4db-2942659c7e24"). InnerVolumeSpecName "kube-api-access-vmjw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.262236 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb" (OuterVolumeSpecName: "kube-api-access-pssdb") pod "9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" (UID: "9cbda9cc-3ec5-4193-a7fb-ff06bdd20846"). InnerVolumeSpecName "kube-api-access-pssdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.365237 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pssdb\" (UniqueName: \"kubernetes.io/projected/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846-kube-api-access-pssdb\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.365275 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmjw4\" (UniqueName: \"kubernetes.io/projected/752279e5-88ff-469d-a4db-2942659c7e24-kube-api-access-vmjw4\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.397409 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gxjzh-config-c4mg2"] Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.407136 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gxjzh-config-c4mg2"] Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.543661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-gxjzh-config-gzr9r"] Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544016 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544034 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544053 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752279e5-88ff-469d-a4db-2942659c7e24" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544058 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="752279e5-88ff-469d-a4db-2942659c7e24" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc 
kubenswrapper[4772]: E0127 15:25:42.544070 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef900211-2a44-498c-adb6-fec1abcba5ec" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544077 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef900211-2a44-498c-adb6-fec1abcba5ec" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544087 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbad3a30-e11d-4ae8-9c42-e06b6382c6de" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544093 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbad3a30-e11d-4ae8-9c42-e06b6382c6de" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544103 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b68551f-119d-4d84-9c91-20e013018b7a" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544108 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b68551f-119d-4d84-9c91-20e013018b7a" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544121 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af586fb2-38ff-4e17-86bc-a7793cb3ac45" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544128 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="af586fb2-38ff-4e17-86bc-a7793cb3ac45" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: E0127 15:25:42.544146 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7333f8a-0a54-4dec-8e7a-c7a648d2a841" containerName="ovn-config" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544152 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e7333f8a-0a54-4dec-8e7a-c7a648d2a841" containerName="ovn-config" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544311 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b68551f-119d-4d84-9c91-20e013018b7a" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544324 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="752279e5-88ff-469d-a4db-2942659c7e24" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544332 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7333f8a-0a54-4dec-8e7a-c7a648d2a841" containerName="ovn-config" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544346 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef900211-2a44-498c-adb6-fec1abcba5ec" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544357 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" containerName="mariadb-account-create-update" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544367 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbad3a30-e11d-4ae8-9c42-e06b6382c6de" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544376 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="af586fb2-38ff-4e17-86bc-a7793cb3ac45" containerName="mariadb-database-create" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.544882 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.547138 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.554095 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh-config-gzr9r"] Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.572871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbxmh\" (UniqueName: \"kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.572943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.572993 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.573061 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: 
\"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.573376 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.573415 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.674324 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7333f8a-0a54-4dec-8e7a-c7a648d2a841" path="/var/lib/kubelet/pods/e7333f8a-0a54-4dec-8e7a-c7a648d2a841/volumes" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.674816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.674858 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: 
I0127 15:25:42.674944 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbxmh\" (UniqueName: \"kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.674978 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.675009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.675046 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.675368 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.675408 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.675490 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.676133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.678295 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.704375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbxmh\" (UniqueName: \"kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh\") pod \"ovn-controller-gxjzh-config-gzr9r\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.786032 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-e8b1-account-create-update-8rlww" event={"ID":"9cbda9cc-3ec5-4193-a7fb-ff06bdd20846","Type":"ContainerDied","Data":"b7b0b5ce95f1f8f79fe79dcceda26c46edd68768fdca8bfa61a8c002836d8e7e"} Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.786070 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7b0b5ce95f1f8f79fe79dcceda26c46edd68768fdca8bfa61a8c002836d8e7e" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.786245 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e8b1-account-create-update-8rlww" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.787463 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a648-account-create-update-qhx8z" event={"ID":"752279e5-88ff-469d-a4db-2942659c7e24","Type":"ContainerDied","Data":"71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419"} Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.787485 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71e0dffca8f1bcda6eb557e7af38812a722313603f4af46f0a96549446e9a419" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.787564 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a648-account-create-update-qhx8z" Jan 27 15:25:42 crc kubenswrapper[4772]: I0127 15:25:42.873050 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:43 crc kubenswrapper[4772]: I0127 15:25:43.311230 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-gxjzh-config-gzr9r"] Jan 27 15:25:43 crc kubenswrapper[4772]: W0127 15:25:43.311652 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb287920c_19e8_47a5_9644_a1651f4c39e9.slice/crio-ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938 WatchSource:0}: Error finding container ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938: Status 404 returned error can't find the container with id ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938 Jan 27 15:25:43 crc kubenswrapper[4772]: I0127 15:25:43.802091 4772 generic.go:334] "Generic (PLEG): container finished" podID="b287920c-19e8-47a5-9644-a1651f4c39e9" containerID="666a2855e8df449d0b2a9f22d64efe41fc16e80a56e57924cea7c6f56eb00af0" exitCode=0 Jan 27 15:25:43 crc kubenswrapper[4772]: I0127 15:25:43.802192 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-gzr9r" event={"ID":"b287920c-19e8-47a5-9644-a1651f4c39e9","Type":"ContainerDied","Data":"666a2855e8df449d0b2a9f22d64efe41fc16e80a56e57924cea7c6f56eb00af0"} Jan 27 15:25:43 crc kubenswrapper[4772]: I0127 15:25:43.802226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-gzr9r" event={"ID":"b287920c-19e8-47a5-9644-a1651f4c39e9","Type":"ContainerStarted","Data":"ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938"} Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.326293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.379553 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.379853 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-tltm6" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="dnsmasq-dns" containerID="cri-o://d1b5117c10f9331477f591f10a624b08ae6968087cc1bb15580ee055f80a719c" gracePeriod=10 Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.811866 4772 generic.go:334] "Generic (PLEG): container finished" podID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerID="d1b5117c10f9331477f591f10a624b08ae6968087cc1bb15580ee055f80a719c" exitCode=0 Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.812052 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tltm6" event={"ID":"01d2ace8-4fbb-4f53-aa31-7557dbaabcce","Type":"ContainerDied","Data":"d1b5117c10f9331477f591f10a624b08ae6968087cc1bb15580ee055f80a719c"} Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.812245 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-tltm6" event={"ID":"01d2ace8-4fbb-4f53-aa31-7557dbaabcce","Type":"ContainerDied","Data":"67ecd74afacd326820a12dc1cfdc76a179790d7bc04cff09eef9f1e0a03e5d5e"} Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.812261 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ecd74afacd326820a12dc1cfdc76a179790d7bc04cff09eef9f1e0a03e5d5e" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.826825 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.916561 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb\") pod \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.916616 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config\") pod \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.916658 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb\") pod \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.916676 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2pzt\" (UniqueName: \"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt\") pod \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.916697 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc\") pod \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\" (UID: \"01d2ace8-4fbb-4f53-aa31-7557dbaabcce\") " Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.936127 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt" (OuterVolumeSpecName: "kube-api-access-p2pzt") pod "01d2ace8-4fbb-4f53-aa31-7557dbaabcce" (UID: "01d2ace8-4fbb-4f53-aa31-7557dbaabcce"). InnerVolumeSpecName "kube-api-access-p2pzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.964119 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "01d2ace8-4fbb-4f53-aa31-7557dbaabcce" (UID: "01d2ace8-4fbb-4f53-aa31-7557dbaabcce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.977707 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config" (OuterVolumeSpecName: "config") pod "01d2ace8-4fbb-4f53-aa31-7557dbaabcce" (UID: "01d2ace8-4fbb-4f53-aa31-7557dbaabcce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.984984 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "01d2ace8-4fbb-4f53-aa31-7557dbaabcce" (UID: "01d2ace8-4fbb-4f53-aa31-7557dbaabcce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:44 crc kubenswrapper[4772]: I0127 15:25:44.986387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01d2ace8-4fbb-4f53-aa31-7557dbaabcce" (UID: "01d2ace8-4fbb-4f53-aa31-7557dbaabcce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.018303 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.018335 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.018346 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2pzt\" (UniqueName: \"kubernetes.io/projected/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-kube-api-access-p2pzt\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.018356 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.018365 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/01d2ace8-4fbb-4f53-aa31-7557dbaabcce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.053537 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.119987 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120076 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120184 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120301 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbxmh\" (UniqueName: \"kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn\") pod \"b287920c-19e8-47a5-9644-a1651f4c39e9\" (UID: \"b287920c-19e8-47a5-9644-a1651f4c39e9\") " Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120678 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120720 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run" (OuterVolumeSpecName: "var-run") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.120853 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.121462 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts" (OuterVolumeSpecName: "scripts") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.124770 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh" (OuterVolumeSpecName: "kube-api-access-xbxmh") pod "b287920c-19e8-47a5-9644-a1651f4c39e9" (UID: "b287920c-19e8-47a5-9644-a1651f4c39e9"). InnerVolumeSpecName "kube-api-access-xbxmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222030 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222075 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbxmh\" (UniqueName: \"kubernetes.io/projected/b287920c-19e8-47a5-9644-a1651f4c39e9-kube-api-access-xbxmh\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222090 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222103 4772 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-additional-scripts\") on node 
\"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222116 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b287920c-19e8-47a5-9644-a1651f4c39e9-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.222127 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b287920c-19e8-47a5-9644-a1651f4c39e9-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.820910 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-tltm6" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.821262 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh-config-gzr9r" event={"ID":"b287920c-19e8-47a5-9644-a1651f4c39e9","Type":"ContainerDied","Data":"ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938"} Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.821335 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce6f9ec54fe9d90af32886adae36e9717f225263d8d3fd44e860708fc18d9938" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.821345 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh-config-gzr9r" Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.856891 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:45 crc kubenswrapper[4772]: I0127 15:25:45.863317 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-tltm6"] Jan 27 15:25:46 crc kubenswrapper[4772]: I0127 15:25:46.120723 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gxjzh-config-gzr9r"] Jan 27 15:25:46 crc kubenswrapper[4772]: I0127 15:25:46.127707 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gxjzh-config-gzr9r"] Jan 27 15:25:46 crc kubenswrapper[4772]: I0127 15:25:46.673793 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" path="/var/lib/kubelet/pods/01d2ace8-4fbb-4f53-aa31-7557dbaabcce/volumes" Jan 27 15:25:46 crc kubenswrapper[4772]: I0127 15:25:46.674774 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b287920c-19e8-47a5-9644-a1651f4c39e9" path="/var/lib/kubelet/pods/b287920c-19e8-47a5-9644-a1651f4c39e9/volumes" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.335679 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-vdmv7"] Jan 27 15:25:47 crc kubenswrapper[4772]: E0127 15:25:47.335996 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b287920c-19e8-47a5-9644-a1651f4c39e9" containerName="ovn-config" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.336009 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b287920c-19e8-47a5-9644-a1651f4c39e9" containerName="ovn-config" Jan 27 15:25:47 crc kubenswrapper[4772]: E0127 15:25:47.336018 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="init" Jan 27 15:25:47 crc 
kubenswrapper[4772]: I0127 15:25:47.336023 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="init" Jan 27 15:25:47 crc kubenswrapper[4772]: E0127 15:25:47.336046 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="dnsmasq-dns" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.336051 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="dnsmasq-dns" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.336210 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b287920c-19e8-47a5-9644-a1651f4c39e9" containerName="ovn-config" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.336223 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d2ace8-4fbb-4f53-aa31-7557dbaabcce" containerName="dnsmasq-dns" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.336786 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.338907 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.338943 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vd4fn" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.350278 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vdmv7"] Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.473967 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.474017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.474058 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.474273 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkbkr\" (UniqueName: 
\"kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.575403 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkbkr\" (UniqueName: \"kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.575488 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.575516 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.575563 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.580934 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle\") pod \"glance-db-sync-vdmv7\" 
(UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.581092 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.584347 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.590310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkbkr\" (UniqueName: \"kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr\") pod \"glance-db-sync-vdmv7\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:47 crc kubenswrapper[4772]: I0127 15:25:47.659549 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vdmv7" Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.211994 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-vdmv7"] Jan 27 15:25:49 crc kubenswrapper[4772]: W0127 15:25:48.218734 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86d0241f_ae16_400f_837c_3b43c904c91e.slice/crio-a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec WatchSource:0}: Error finding container a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec: Status 404 returned error can't find the container with id a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.852692 4772 generic.go:334] "Generic (PLEG): container finished" podID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerID="900401625caff4c2d87fe06884c7dcba7f46fdc58e9213b1a6cc2cf36d383e52" exitCode=0 Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.852792 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerDied","Data":"900401625caff4c2d87fe06884c7dcba7f46fdc58e9213b1a6cc2cf36d383e52"} Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.854916 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vdmv7" event={"ID":"86d0241f-ae16-400f-837c-3b43c904c91e","Type":"ContainerStarted","Data":"a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec"} Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.857080 4772 generic.go:334] "Generic (PLEG): container finished" podID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerID="d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594" exitCode=0 Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:48.857108 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerDied","Data":"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594"} Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.885919 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerStarted","Data":"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a"} Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.886375 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.888897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerStarted","Data":"f002759dea4443f7600e0f76f24481c1604449a5ee31bd8aa53171a2121ec4b2"} Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.889152 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.914961 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.426593695 podStartE2EDuration="1m17.914942503s" podCreationTimestamp="2026-01-27 15:24:32 +0000 UTC" firstStartedPulling="2026-01-27 15:24:34.314057243 +0000 UTC m=+1060.294666331" lastFinishedPulling="2026-01-27 15:25:14.802406041 +0000 UTC m=+1100.783015139" observedRunningTime="2026-01-27 15:25:49.912626906 +0000 UTC m=+1135.893236034" watchObservedRunningTime="2026-01-27 15:25:49.914942503 +0000 UTC m=+1135.895551601" Jan 27 15:25:49 crc kubenswrapper[4772]: I0127 15:25:49.951986 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371957.902811 
podStartE2EDuration="1m18.951965269s" podCreationTimestamp="2026-01-27 15:24:31 +0000 UTC" firstStartedPulling="2026-01-27 15:24:33.501124377 +0000 UTC m=+1059.481733475" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:25:49.940310154 +0000 UTC m=+1135.920919262" watchObservedRunningTime="2026-01-27 15:25:49.951965269 +0000 UTC m=+1135.932574367" Jan 27 15:26:00 crc kubenswrapper[4772]: I0127 15:26:00.974480 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vdmv7" event={"ID":"86d0241f-ae16-400f-837c-3b43c904c91e","Type":"ContainerStarted","Data":"317ff691da5e191e31778e1d02f29484703e057687e372739fcbc9dd6f8088d2"} Jan 27 15:26:00 crc kubenswrapper[4772]: I0127 15:26:00.999894 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-vdmv7" podStartSLOduration=2.356557616 podStartE2EDuration="13.999877194s" podCreationTimestamp="2026-01-27 15:25:47 +0000 UTC" firstStartedPulling="2026-01-27 15:25:48.220746314 +0000 UTC m=+1134.201355422" lastFinishedPulling="2026-01-27 15:25:59.864065902 +0000 UTC m=+1145.844675000" observedRunningTime="2026-01-27 15:26:00.99523813 +0000 UTC m=+1146.975847228" watchObservedRunningTime="2026-01-27 15:26:00.999877194 +0000 UTC m=+1146.980486292" Jan 27 15:26:02 crc kubenswrapper[4772]: I0127 15:26:02.917152 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jan 27 15:26:03 crc kubenswrapper[4772]: I0127 15:26:03.769453 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.059390 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.060293 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.060392 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.061883 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.062031 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163" gracePeriod=600 Jan 27 15:26:12 crc kubenswrapper[4772]: I0127 15:26:12.917450 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.092435 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" 
containerID="ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163" exitCode=0 Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.092489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163"} Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.092526 4772 scope.go:117] "RemoveContainer" containerID="c8213e4fa74445d3800c2dbcb45efc3fb34a6f40c3d5ed5845b811a51d3d8497" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.368680 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-v7ncm"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.369983 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.384269 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v7ncm"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.396653 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-xpbb6"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.397837 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.426656 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xpbb6"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.488963 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-97c3-account-create-update-xlghl"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.489862 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.491549 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.507048 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97c3-account-create-update-xlghl"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.526334 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22x22\" (UniqueName: \"kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22\") pod \"barbican-db-create-xpbb6\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.526574 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.526666 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrbpt\" (UniqueName: \"kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.526741 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts\") pod \"barbican-db-create-xpbb6\" (UID: 
\"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.587942 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-z224f"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.589131 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.595643 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-d2a3-account-create-update-hfkkb"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.596525 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.611357 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.626141 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z224f"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.628388 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22x22\" (UniqueName: \"kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22\") pod \"barbican-db-create-xpbb6\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.629317 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5rsl\" (UniqueName: \"kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 
15:26:13.629582 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.629739 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.630569 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.636751 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrbpt\" (UniqueName: \"kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.638052 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts\") pod \"barbican-db-create-xpbb6\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.639764 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts\") pod \"barbican-db-create-xpbb6\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.654894 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d2a3-account-create-update-hfkkb"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.670665 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrbpt\" (UniqueName: \"kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt\") pod \"cinder-db-create-v7ncm\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.671317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22x22\" (UniqueName: \"kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22\") pod \"barbican-db-create-xpbb6\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.684330 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kvb25"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.685300 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.686588 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.690442 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.690496 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdjsw" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.690504 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.693392 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.700914 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kvb25"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.713570 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742306 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5rsl\" (UniqueName: \"kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742393 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742445 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ffgc\" (UniqueName: \"kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742535 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742586 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 
15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742624 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhpb\" (UniqueName: \"kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.742644 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.744232 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.766790 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5rsl\" (UniqueName: \"kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl\") pod \"barbican-97c3-account-create-update-xlghl\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.798023 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ed9a-account-create-update-b7pnl"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.799296 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.804445 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.804987 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.823206 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ed9a-account-create-update-b7pnl"] Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845595 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ffgc\" (UniqueName: \"kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845742 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845855 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845883 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhpb\" (UniqueName: \"kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t54t7\" (UniqueName: \"kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.845943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.846040 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.850954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.852021 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.856373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.856762 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.870716 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rfhpb\" (UniqueName: \"kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb\") pod \"cinder-d2a3-account-create-update-hfkkb\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.871339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ffgc\" (UniqueName: \"kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc\") pod \"neutron-db-create-z224f\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.895910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m\") pod \"keystone-db-sync-kvb25\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.921535 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.921576 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z224f" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.947253 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.947301 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t54t7\" (UniqueName: \"kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:13 crc kubenswrapper[4772]: I0127 15:26:13.948409 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.043787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t54t7\" (UniqueName: \"kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7\") pod \"neutron-ed9a-account-create-update-b7pnl\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.114760 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9"} Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.130972 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.157550 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.202690 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-v7ncm"] Jan 27 15:26:14 crc kubenswrapper[4772]: W0127 15:26:14.227122 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1746148_2e3f_476f_9a1f_f3656d44fb0b.slice/crio-d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3 WatchSource:0}: Error finding container d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3: Status 404 returned error can't find the container with id d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3 Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.386419 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-xpbb6"] Jan 27 15:26:14 crc kubenswrapper[4772]: W0127 15:26:14.392664 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf61fb8e_e749_4872_8dc6_c590e4b9787a.slice/crio-4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd WatchSource:0}: Error finding container 4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd: Status 404 returned error can't find the container with id 4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.394063 4772 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97c3-account-create-update-xlghl"] Jan 27 15:26:14 crc kubenswrapper[4772]: W0127 15:26:14.398873 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee9c9aa3_63e7_49ae_b3f3_f9bc0802f112.slice/crio-7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf WatchSource:0}: Error finding container 7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf: Status 404 returned error can't find the container with id 7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.629682 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d2a3-account-create-update-hfkkb"] Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.638386 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-z224f"] Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.715112 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kvb25"] Jan 27 15:26:14 crc kubenswrapper[4772]: I0127 15:26:14.742933 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ed9a-account-create-update-b7pnl"] Jan 27 15:26:14 crc kubenswrapper[4772]: W0127 15:26:14.771487 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb163780a_6dd7_4232_b0da_a22f18d36fcc.slice/crio-e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee WatchSource:0}: Error finding container e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee: Status 404 returned error can't find the container with id e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.122577 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-kvb25" event={"ID":"57a1d71f-3b00-42c0-92c4-a29fb3d4518c","Type":"ContainerStarted","Data":"cc68b23df999b3c351ddd5132639a8352068b638f27d53a94da0ce06a1009dad"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.123658 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed9a-account-create-update-b7pnl" event={"ID":"b163780a-6dd7-4232-b0da-a22f18d36fcc","Type":"ContainerStarted","Data":"e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.124736 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-xlghl" event={"ID":"af61fb8e-e749-4872-8dc6-c590e4b9787a","Type":"ContainerStarted","Data":"4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.125703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v7ncm" event={"ID":"f1746148-2e3f-476f-9a1f-f3656d44fb0b","Type":"ContainerStarted","Data":"d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.131208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2a3-account-create-update-hfkkb" event={"ID":"c54b2036-d943-4f0d-b1c4-8a47dfab5099","Type":"ContainerStarted","Data":"08d6532f77d29e47d3d345a3cbcc4de484a6e75463c90eca472f8ccd07be6a84"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.132502 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z224f" event={"ID":"b412abae-93af-4ae0-8cd8-7c0a827da4b3","Type":"ContainerStarted","Data":"96d15bd1f51df9f2151a057a908e64905ec96ea27be5ce6e7029e233334cd82f"} Jan 27 15:26:15 crc kubenswrapper[4772]: I0127 15:26:15.133515 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xpbb6" 
event={"ID":"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112","Type":"ContainerStarted","Data":"7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.145102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2a3-account-create-update-hfkkb" event={"ID":"c54b2036-d943-4f0d-b1c4-8a47dfab5099","Type":"ContainerStarted","Data":"c454404cb2dabeb6539bab075b0096e5a7ba9d3726f1b7a2ce5d55b30cc778e8"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.147750 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z224f" event={"ID":"b412abae-93af-4ae0-8cd8-7c0a827da4b3","Type":"ContainerStarted","Data":"478f9eb73f50cba542d4259825587e98caddfe9513876ed4823af8b00681f571"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.150194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xpbb6" event={"ID":"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112","Type":"ContainerStarted","Data":"7cb0416a54334bdd5699afd4b64397c193035c399e5586172a360ff52cd674f9"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.152827 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed9a-account-create-update-b7pnl" event={"ID":"b163780a-6dd7-4232-b0da-a22f18d36fcc","Type":"ContainerStarted","Data":"a5ffbaeea04257a22f38554ccc4304785fadfe22ac90bb6e3544b162aab10857"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.155564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-xlghl" event={"ID":"af61fb8e-e749-4872-8dc6-c590e4b9787a","Type":"ContainerStarted","Data":"d265eb93689c326c68ce844d36ec8e13845ff3f6cfb1ed7e88273d0cf4e91cbd"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.156706 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v7ncm" 
event={"ID":"f1746148-2e3f-476f-9a1f-f3656d44fb0b","Type":"ContainerStarted","Data":"597636ff183f237bb3b639ea5c67c6b5f6f29f40e362b71df3d4ec02eaa6036b"} Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.167952 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-d2a3-account-create-update-hfkkb" podStartSLOduration=3.167932458 podStartE2EDuration="3.167932458s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.157425536 +0000 UTC m=+1162.138034634" watchObservedRunningTime="2026-01-27 15:26:16.167932458 +0000 UTC m=+1162.148541556" Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.173847 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-xpbb6" podStartSLOduration=3.173825248 podStartE2EDuration="3.173825248s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.173601932 +0000 UTC m=+1162.154211040" watchObservedRunningTime="2026-01-27 15:26:16.173825248 +0000 UTC m=+1162.154434346" Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.195831 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-z224f" podStartSLOduration=3.195813932 podStartE2EDuration="3.195813932s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.188831201 +0000 UTC m=+1162.169440299" watchObservedRunningTime="2026-01-27 15:26:16.195813932 +0000 UTC m=+1162.176423030" Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.208747 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-ed9a-account-create-update-b7pnl" podStartSLOduration=3.208728104 podStartE2EDuration="3.208728104s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.207526299 +0000 UTC m=+1162.188135397" watchObservedRunningTime="2026-01-27 15:26:16.208728104 +0000 UTC m=+1162.189337202" Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.225158 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-v7ncm" podStartSLOduration=3.225139677 podStartE2EDuration="3.225139677s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.218470434 +0000 UTC m=+1162.199079542" watchObservedRunningTime="2026-01-27 15:26:16.225139677 +0000 UTC m=+1162.205748765" Jan 27 15:26:16 crc kubenswrapper[4772]: I0127 15:26:16.253881 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-97c3-account-create-update-xlghl" podStartSLOduration=3.253866454 podStartE2EDuration="3.253866454s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:16.25164821 +0000 UTC m=+1162.232257308" watchObservedRunningTime="2026-01-27 15:26:16.253866454 +0000 UTC m=+1162.234475552" Jan 27 15:26:18 crc kubenswrapper[4772]: I0127 15:26:18.171032 4772 generic.go:334] "Generic (PLEG): container finished" podID="f1746148-2e3f-476f-9a1f-f3656d44fb0b" containerID="597636ff183f237bb3b639ea5c67c6b5f6f29f40e362b71df3d4ec02eaa6036b" exitCode=0 Jan 27 15:26:18 crc kubenswrapper[4772]: I0127 15:26:18.171498 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-v7ncm" event={"ID":"f1746148-2e3f-476f-9a1f-f3656d44fb0b","Type":"ContainerDied","Data":"597636ff183f237bb3b639ea5c67c6b5f6f29f40e362b71df3d4ec02eaa6036b"} Jan 27 15:26:19 crc kubenswrapper[4772]: I0127 15:26:19.184547 4772 generic.go:334] "Generic (PLEG): container finished" podID="ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" containerID="7cb0416a54334bdd5699afd4b64397c193035c399e5586172a360ff52cd674f9" exitCode=0 Jan 27 15:26:19 crc kubenswrapper[4772]: I0127 15:26:19.184627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xpbb6" event={"ID":"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112","Type":"ContainerDied","Data":"7cb0416a54334bdd5699afd4b64397c193035c399e5586172a360ff52cd674f9"} Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.662138 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.669223 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.709967 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts\") pod \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.710017 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts\") pod \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.710050 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrbpt\" (UniqueName: \"kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt\") pod \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\" (UID: \"f1746148-2e3f-476f-9a1f-f3656d44fb0b\") " Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.710302 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22x22\" (UniqueName: \"kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22\") pod \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\" (UID: \"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112\") " Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.710933 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" (UID: "ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.711670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1746148-2e3f-476f-9a1f-f3656d44fb0b" (UID: "f1746148-2e3f-476f-9a1f-f3656d44fb0b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.717828 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt" (OuterVolumeSpecName: "kube-api-access-mrbpt") pod "f1746148-2e3f-476f-9a1f-f3656d44fb0b" (UID: "f1746148-2e3f-476f-9a1f-f3656d44fb0b"). InnerVolumeSpecName "kube-api-access-mrbpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.718416 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22" (OuterVolumeSpecName: "kube-api-access-22x22") pod "ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" (UID: "ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112"). InnerVolumeSpecName "kube-api-access-22x22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.812213 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22x22\" (UniqueName: \"kubernetes.io/projected/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-kube-api-access-22x22\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.812261 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1746148-2e3f-476f-9a1f-f3656d44fb0b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.812274 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:21 crc kubenswrapper[4772]: I0127 15:26:21.812288 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrbpt\" (UniqueName: \"kubernetes.io/projected/f1746148-2e3f-476f-9a1f-f3656d44fb0b-kube-api-access-mrbpt\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.208784 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-xpbb6" event={"ID":"ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112","Type":"ContainerDied","Data":"7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf"} Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.208809 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-xpbb6" Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.208820 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7078456546f57835b75bdc01276c8dc7b83b93913928c27cd8b38ff3cf49ddaf" Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.210508 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-v7ncm" event={"ID":"f1746148-2e3f-476f-9a1f-f3656d44fb0b","Type":"ContainerDied","Data":"d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3"} Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.210554 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-v7ncm" Jan 27 15:26:22 crc kubenswrapper[4772]: I0127 15:26:22.210554 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7520f953d62324f0f59051719778147c98517e633ea0fe6fa2fd5195a4cadf3" Jan 27 15:26:23 crc kubenswrapper[4772]: I0127 15:26:23.221313 4772 generic.go:334] "Generic (PLEG): container finished" podID="b412abae-93af-4ae0-8cd8-7c0a827da4b3" containerID="478f9eb73f50cba542d4259825587e98caddfe9513876ed4823af8b00681f571" exitCode=0 Jan 27 15:26:23 crc kubenswrapper[4772]: I0127 15:26:23.221425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z224f" event={"ID":"b412abae-93af-4ae0-8cd8-7c0a827da4b3","Type":"ContainerDied","Data":"478f9eb73f50cba542d4259825587e98caddfe9513876ed4823af8b00681f571"} Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.091725 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-z224f" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.162947 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts\") pod \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.163033 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ffgc\" (UniqueName: \"kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc\") pod \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\" (UID: \"b412abae-93af-4ae0-8cd8-7c0a827da4b3\") " Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.163572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b412abae-93af-4ae0-8cd8-7c0a827da4b3" (UID: "b412abae-93af-4ae0-8cd8-7c0a827da4b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.168416 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc" (OuterVolumeSpecName: "kube-api-access-9ffgc") pod "b412abae-93af-4ae0-8cd8-7c0a827da4b3" (UID: "b412abae-93af-4ae0-8cd8-7c0a827da4b3"). InnerVolumeSpecName "kube-api-access-9ffgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.238251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-z224f" event={"ID":"b412abae-93af-4ae0-8cd8-7c0a827da4b3","Type":"ContainerDied","Data":"96d15bd1f51df9f2151a057a908e64905ec96ea27be5ce6e7029e233334cd82f"} Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.238294 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96d15bd1f51df9f2151a057a908e64905ec96ea27be5ce6e7029e233334cd82f" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.238312 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-z224f" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.265113 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b412abae-93af-4ae0-8cd8-7c0a827da4b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:25 crc kubenswrapper[4772]: I0127 15:26:25.265148 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ffgc\" (UniqueName: \"kubernetes.io/projected/b412abae-93af-4ae0-8cd8-7c0a827da4b3-kube-api-access-9ffgc\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:26 crc kubenswrapper[4772]: I0127 15:26:26.248925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvb25" event={"ID":"57a1d71f-3b00-42c0-92c4-a29fb3d4518c","Type":"ContainerStarted","Data":"e1a2cafeb608c7919a88b50bf39a141cb90ef87745db78d4f8f6a94522bb8d2e"} Jan 27 15:26:26 crc kubenswrapper[4772]: I0127 15:26:26.267790 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kvb25" podStartSLOduration=2.667467401 podStartE2EDuration="13.267768321s" podCreationTimestamp="2026-01-27 15:26:13 +0000 UTC" firstStartedPulling="2026-01-27 15:26:14.733393191 +0000 UTC 
m=+1160.714002289" lastFinishedPulling="2026-01-27 15:26:25.333694111 +0000 UTC m=+1171.314303209" observedRunningTime="2026-01-27 15:26:26.265025792 +0000 UTC m=+1172.245634900" watchObservedRunningTime="2026-01-27 15:26:26.267768321 +0000 UTC m=+1172.248377419" Jan 27 15:26:33 crc kubenswrapper[4772]: I0127 15:26:33.317460 4772 generic.go:334] "Generic (PLEG): container finished" podID="b163780a-6dd7-4232-b0da-a22f18d36fcc" containerID="a5ffbaeea04257a22f38554ccc4304785fadfe22ac90bb6e3544b162aab10857" exitCode=0 Jan 27 15:26:33 crc kubenswrapper[4772]: I0127 15:26:33.317563 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed9a-account-create-update-b7pnl" event={"ID":"b163780a-6dd7-4232-b0da-a22f18d36fcc","Type":"ContainerDied","Data":"a5ffbaeea04257a22f38554ccc4304785fadfe22ac90bb6e3544b162aab10857"} Jan 27 15:26:33 crc kubenswrapper[4772]: I0127 15:26:33.320750 4772 generic.go:334] "Generic (PLEG): container finished" podID="af61fb8e-e749-4872-8dc6-c590e4b9787a" containerID="d265eb93689c326c68ce844d36ec8e13845ff3f6cfb1ed7e88273d0cf4e91cbd" exitCode=0 Jan 27 15:26:33 crc kubenswrapper[4772]: I0127 15:26:33.320817 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-xlghl" event={"ID":"af61fb8e-e749-4872-8dc6-c590e4b9787a","Type":"ContainerDied","Data":"d265eb93689c326c68ce844d36ec8e13845ff3f6cfb1ed7e88273d0cf4e91cbd"} Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.778557 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.790432 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.932477 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts\") pod \"b163780a-6dd7-4232-b0da-a22f18d36fcc\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.932642 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5rsl\" (UniqueName: \"kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl\") pod \"af61fb8e-e749-4872-8dc6-c590e4b9787a\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.932688 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t54t7\" (UniqueName: \"kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7\") pod \"b163780a-6dd7-4232-b0da-a22f18d36fcc\" (UID: \"b163780a-6dd7-4232-b0da-a22f18d36fcc\") " Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.932748 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts\") pod \"af61fb8e-e749-4872-8dc6-c590e4b9787a\" (UID: \"af61fb8e-e749-4872-8dc6-c590e4b9787a\") " Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.933394 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b163780a-6dd7-4232-b0da-a22f18d36fcc" (UID: "b163780a-6dd7-4232-b0da-a22f18d36fcc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.933508 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af61fb8e-e749-4872-8dc6-c590e4b9787a" (UID: "af61fb8e-e749-4872-8dc6-c590e4b9787a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.933927 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b163780a-6dd7-4232-b0da-a22f18d36fcc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.933960 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af61fb8e-e749-4872-8dc6-c590e4b9787a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.938843 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7" (OuterVolumeSpecName: "kube-api-access-t54t7") pod "b163780a-6dd7-4232-b0da-a22f18d36fcc" (UID: "b163780a-6dd7-4232-b0da-a22f18d36fcc"). InnerVolumeSpecName "kube-api-access-t54t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:34 crc kubenswrapper[4772]: I0127 15:26:34.938970 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl" (OuterVolumeSpecName: "kube-api-access-f5rsl") pod "af61fb8e-e749-4872-8dc6-c590e4b9787a" (UID: "af61fb8e-e749-4872-8dc6-c590e4b9787a"). InnerVolumeSpecName "kube-api-access-f5rsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.035654 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5rsl\" (UniqueName: \"kubernetes.io/projected/af61fb8e-e749-4872-8dc6-c590e4b9787a-kube-api-access-f5rsl\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.035693 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t54t7\" (UniqueName: \"kubernetes.io/projected/b163780a-6dd7-4232-b0da-a22f18d36fcc-kube-api-access-t54t7\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.345784 4772 generic.go:334] "Generic (PLEG): container finished" podID="c54b2036-d943-4f0d-b1c4-8a47dfab5099" containerID="c454404cb2dabeb6539bab075b0096e5a7ba9d3726f1b7a2ce5d55b30cc778e8" exitCode=0 Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.345799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2a3-account-create-update-hfkkb" event={"ID":"c54b2036-d943-4f0d-b1c4-8a47dfab5099","Type":"ContainerDied","Data":"c454404cb2dabeb6539bab075b0096e5a7ba9d3726f1b7a2ce5d55b30cc778e8"} Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.348624 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ed9a-account-create-update-b7pnl" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.348627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ed9a-account-create-update-b7pnl" event={"ID":"b163780a-6dd7-4232-b0da-a22f18d36fcc","Type":"ContainerDied","Data":"e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee"} Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.348684 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e83383c104bd8b761200170a9e0bc2e126e98220c9813a9e33d886b26ab147ee" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.350414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-xlghl" event={"ID":"af61fb8e-e749-4872-8dc6-c590e4b9787a","Type":"ContainerDied","Data":"4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd"} Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.350454 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fc5b98afeb27a7f560b6faf0b64da965c312fa1d6668ea8e195f9bf3cf774cd" Jan 27 15:26:35 crc kubenswrapper[4772]: I0127 15:26:35.350487 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97c3-account-create-update-xlghl" Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.643099 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.659976 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts\") pod \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.660046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfhpb\" (UniqueName: \"kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb\") pod \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\" (UID: \"c54b2036-d943-4f0d-b1c4-8a47dfab5099\") " Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.661396 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c54b2036-d943-4f0d-b1c4-8a47dfab5099" (UID: "c54b2036-d943-4f0d-b1c4-8a47dfab5099"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.667326 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb" (OuterVolumeSpecName: "kube-api-access-rfhpb") pod "c54b2036-d943-4f0d-b1c4-8a47dfab5099" (UID: "c54b2036-d943-4f0d-b1c4-8a47dfab5099"). InnerVolumeSpecName "kube-api-access-rfhpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.762280 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54b2036-d943-4f0d-b1c4-8a47dfab5099-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:36 crc kubenswrapper[4772]: I0127 15:26:36.762310 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfhpb\" (UniqueName: \"kubernetes.io/projected/c54b2036-d943-4f0d-b1c4-8a47dfab5099-kube-api-access-rfhpb\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:37 crc kubenswrapper[4772]: I0127 15:26:37.371012 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d2a3-account-create-update-hfkkb" event={"ID":"c54b2036-d943-4f0d-b1c4-8a47dfab5099","Type":"ContainerDied","Data":"08d6532f77d29e47d3d345a3cbcc4de484a6e75463c90eca472f8ccd07be6a84"} Jan 27 15:26:37 crc kubenswrapper[4772]: I0127 15:26:37.371066 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08d6532f77d29e47d3d345a3cbcc4de484a6e75463c90eca472f8ccd07be6a84" Jan 27 15:26:37 crc kubenswrapper[4772]: I0127 15:26:37.371084 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d2a3-account-create-update-hfkkb" Jan 27 15:26:45 crc kubenswrapper[4772]: I0127 15:26:45.441134 4772 generic.go:334] "Generic (PLEG): container finished" podID="57a1d71f-3b00-42c0-92c4-a29fb3d4518c" containerID="e1a2cafeb608c7919a88b50bf39a141cb90ef87745db78d4f8f6a94522bb8d2e" exitCode=0 Jan 27 15:26:45 crc kubenswrapper[4772]: I0127 15:26:45.441265 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvb25" event={"ID":"57a1d71f-3b00-42c0-92c4-a29fb3d4518c","Type":"ContainerDied","Data":"e1a2cafeb608c7919a88b50bf39a141cb90ef87745db78d4f8f6a94522bb8d2e"} Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.780186 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.928376 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m\") pod \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.928418 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data\") pod \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.928492 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle\") pod \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\" (UID: \"57a1d71f-3b00-42c0-92c4-a29fb3d4518c\") " Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.934270 4772 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m" (OuterVolumeSpecName: "kube-api-access-mgk7m") pod "57a1d71f-3b00-42c0-92c4-a29fb3d4518c" (UID: "57a1d71f-3b00-42c0-92c4-a29fb3d4518c"). InnerVolumeSpecName "kube-api-access-mgk7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.963895 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57a1d71f-3b00-42c0-92c4-a29fb3d4518c" (UID: "57a1d71f-3b00-42c0-92c4-a29fb3d4518c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:26:46 crc kubenswrapper[4772]: I0127 15:26:46.979382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data" (OuterVolumeSpecName: "config-data") pod "57a1d71f-3b00-42c0-92c4-a29fb3d4518c" (UID: "57a1d71f-3b00-42c0-92c4-a29fb3d4518c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.031025 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgk7m\" (UniqueName: \"kubernetes.io/projected/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-kube-api-access-mgk7m\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.031080 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.031094 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57a1d71f-3b00-42c0-92c4-a29fb3d4518c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.458749 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kvb25" event={"ID":"57a1d71f-3b00-42c0-92c4-a29fb3d4518c","Type":"ContainerDied","Data":"cc68b23df999b3c351ddd5132639a8352068b638f27d53a94da0ce06a1009dad"} Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.458791 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc68b23df999b3c351ddd5132639a8352068b638f27d53a94da0ce06a1009dad" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.458795 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kvb25" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722443 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722769 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af61fb8e-e749-4872-8dc6-c590e4b9787a" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722785 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="af61fb8e-e749-4872-8dc6-c590e4b9787a" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722811 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722817 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722828 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1746148-2e3f-476f-9a1f-f3656d44fb0b" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722834 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1746148-2e3f-476f-9a1f-f3656d44fb0b" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722847 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54b2036-d943-4f0d-b1c4-8a47dfab5099" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722852 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54b2036-d943-4f0d-b1c4-8a47dfab5099" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722861 4772 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b412abae-93af-4ae0-8cd8-7c0a827da4b3" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722868 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b412abae-93af-4ae0-8cd8-7c0a827da4b3" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722877 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a1d71f-3b00-42c0-92c4-a29fb3d4518c" containerName="keystone-db-sync" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a1d71f-3b00-42c0-92c4-a29fb3d4518c" containerName="keystone-db-sync" Jan 27 15:26:47 crc kubenswrapper[4772]: E0127 15:26:47.722900 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b163780a-6dd7-4232-b0da-a22f18d36fcc" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.722906 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b163780a-6dd7-4232-b0da-a22f18d36fcc" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723069 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723081 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a1d71f-3b00-42c0-92c4-a29fb3d4518c" containerName="keystone-db-sync" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723089 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b163780a-6dd7-4232-b0da-a22f18d36fcc" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723100 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54b2036-d943-4f0d-b1c4-8a47dfab5099" containerName="mariadb-account-create-update" 
Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723112 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1746148-2e3f-476f-9a1f-f3656d44fb0b" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723122 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="af61fb8e-e749-4872-8dc6-c590e4b9787a" containerName="mariadb-account-create-update" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.723132 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b412abae-93af-4ae0-8cd8-7c0a827da4b3" containerName="mariadb-database-create" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.724758 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.733933 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.756752 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rl8kf"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.759080 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.764159 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.764312 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.764510 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.764620 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.764896 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdjsw" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.772857 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rl8kf"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851445 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851511 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851553 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tx2xr\" (UniqueName: \"kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851643 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwktg\" (UniqueName: \"kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851692 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851806 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.851866 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.852115 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.852217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.852388 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.924246 4772 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/neutron-db-sync-v689b"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.925840 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.930485 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.930916 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rjm9r" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.936314 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-v689b"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.937749 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwktg\" (UniqueName: \"kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954873 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") 
" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954974 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.954990 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: 
I0127 15:26:47.955018 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.955049 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.955068 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.955087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx2xr\" (UniqueName: \"kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.957426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.959445 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.960209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.961868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.962095 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.962718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.964032 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys\") 
pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.968899 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.972696 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.975352 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.984836 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.988149 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.992933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx2xr\" (UniqueName: \"kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr\") pod \"dnsmasq-dns-55fff446b9-x8hbr\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.993052 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:26:47 crc kubenswrapper[4772]: I0127 15:26:47.996076 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwktg\" (UniqueName: \"kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg\") pod \"keystone-bootstrap-rl8kf\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.006966 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.026186 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.052949 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.056008 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.056085 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jn9b\" (UniqueName: \"kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.056112 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.060540 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pmk27"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.061661 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.063510 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-flljj" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.063846 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.079120 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pmk27"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.079841 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.115316 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8l85z"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.116635 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.122759 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.122992 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8nhs4" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.123203 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.131059 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8l85z"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160147 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jn9b\" (UniqueName: \"kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160320 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8cvd2\" (UniqueName: \"kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxkw\" (UniqueName: \"kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160467 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160492 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160518 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160540 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160566 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160591 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160621 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.160652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.166020 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle\") pod \"neutron-db-sync-v689b\" (UID: 
\"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.173927 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.200353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jn9b\" (UniqueName: \"kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b\") pod \"neutron-db-sync-v689b\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.204838 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.225560 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zf2tx"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.226584 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.232496 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.232706 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.232817 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4tg2g" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.234701 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zf2tx"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.243526 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.245005 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.246598 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v689b" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.250951 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262702 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262723 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262744 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " 
pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262858 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cvd2\" (UniqueName: \"kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262925 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4tmg\" (UniqueName: \"kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262950 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.262983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.263015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxkw\" (UniqueName: \"kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.263032 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.265325 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.269382 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.271159 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.272459 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.277928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 
27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.278742 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.280410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.285096 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cvd2\" (UniqueName: \"kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2\") pod \"barbican-db-sync-pmk27\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.285994 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxkw\" (UniqueName: \"kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.293366 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data\") pod \"ceilometer-0\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364721 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlv8\" (UniqueName: \"kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364839 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364868 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364890 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364958 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.364987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365101 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8x5b\" (UniqueName: \"kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365131 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365224 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365385 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4tmg\" (UniqueName: \"kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365425 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365461 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365490 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.365599 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.368512 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.372428 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.372447 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.386023 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.386538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.392373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4tmg\" (UniqueName: \"kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg\") pod \"cinder-db-sync-8l85z\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") " pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.441695 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pmk27" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.466884 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.466948 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlv8\" (UniqueName: \"kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.466971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.466988 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467036 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 
15:26:48.467056 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467095 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8x5b\" (UniqueName: \"kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467151 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467180 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467201 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.467955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.469440 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.470607 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.471204 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.474622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.475693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.476592 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.476750 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.477712 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.488054 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlv8\" (UniqueName: \"kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8\") pod \"dnsmasq-dns-76fcf4b695-8d9vp\" (UID: 
\"e329efba-60e3-49c7-81ff-b073be77e34b\") " pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.491687 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8x5b\" (UniqueName: \"kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b\") pod \"placement-db-sync-zf2tx\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.533331 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8l85z" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.557573 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zf2tx" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.573543 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.709982 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:48 crc kubenswrapper[4772]: W0127 15:26:48.766573 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34ee743a_1628_42e4_a465_0e3957cae089.slice/crio-e4869b167a5dcad1e31d6b3e55e70bfbc70108cabaa63e1ee42f306d56b633d7 WatchSource:0}: Error finding container e4869b167a5dcad1e31d6b3e55e70bfbc70108cabaa63e1ee42f306d56b633d7: Status 404 returned error can't find the container with id e4869b167a5dcad1e31d6b3e55e70bfbc70108cabaa63e1ee42f306d56b633d7 Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.882645 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rl8kf"] Jan 27 15:26:48 crc kubenswrapper[4772]: I0127 15:26:48.914049 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-v689b"] Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.045810 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.208055 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pmk27"] Jan 27 15:26:49 crc kubenswrapper[4772]: W0127 15:26:49.213762 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a423229_06be_4934_9715_58105e1af686.slice/crio-604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de WatchSource:0}: Error finding container 604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de: Status 404 returned error can't find the container with id 604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.318960 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8l85z"] Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.380724 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zf2tx"] Jan 27 15:26:49 crc kubenswrapper[4772]: W0127 15:26:49.385128 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode329efba_60e3_49c7_81ff_b073be77e34b.slice/crio-dc924b979634b0cd0c7264ffb70a5a244bf22da4a19a82562f283a10b69f4841 WatchSource:0}: Error finding container dc924b979634b0cd0c7264ffb70a5a244bf22da4a19a82562f283a10b69f4841: Status 404 returned error can't find the container with id dc924b979634b0cd0c7264ffb70a5a244bf22da4a19a82562f283a10b69f4841 Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.389548 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.490194 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerStarted","Data":"a0254acb416eb806ca40cead3274ef3b55185c0cdbabec25da60a2a08040318a"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.492706 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zf2tx" event={"ID":"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6","Type":"ContainerStarted","Data":"7dfb28db1e03cbbc36a413590b93de83567a6c9fa02be76267be1180098e9795"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.494551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v689b" event={"ID":"b0625578-3b48-44c7-9082-174fce3a7e74","Type":"ContainerStarted","Data":"b3123ce803c91e7738d4af911f91769cd0703aad347549f4989b2ccc532f36ea"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.495513 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8l85z" event={"ID":"9ae05919-68bf-43d1-abd9-9908ec287bd0","Type":"ContainerStarted","Data":"6c7bfeb67dfdf4e440bd40114d111aab9077e461f93cb6bdda5f337cad29c97d"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.498318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmk27" event={"ID":"5a423229-06be-4934-9715-58105e1af686","Type":"ContainerStarted","Data":"604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.500149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" event={"ID":"e329efba-60e3-49c7-81ff-b073be77e34b","Type":"ContainerStarted","Data":"dc924b979634b0cd0c7264ffb70a5a244bf22da4a19a82562f283a10b69f4841"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.509778 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rl8kf" 
event={"ID":"0851ad59-841c-4133-a043-13d2cfdb0803","Type":"ContainerStarted","Data":"c20becc5003b571ca45e8d820a72a46ddfed0eee505f84347fd06aa34646e7c4"} Jan 27 15:26:49 crc kubenswrapper[4772]: I0127 15:26:49.511224 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" event={"ID":"34ee743a-1628-42e4-a465-0e3957cae089","Type":"ContainerStarted","Data":"e4869b167a5dcad1e31d6b3e55e70bfbc70108cabaa63e1ee42f306d56b633d7"} Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.517909 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.523439 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rl8kf" event={"ID":"0851ad59-841c-4133-a043-13d2cfdb0803","Type":"ContainerStarted","Data":"c63a10e019701dbe41c4487398c76cb4acdd6a0eda99f6edb9df7d6273b71a27"} Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.530555 4772 generic.go:334] "Generic (PLEG): container finished" podID="34ee743a-1628-42e4-a465-0e3957cae089" containerID="d73a0cc0e4bda3bf05abf4b518f94f7eadee3b5d18eccad0b9033c86816467f5" exitCode=0 Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.530668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" event={"ID":"34ee743a-1628-42e4-a465-0e3957cae089","Type":"ContainerDied","Data":"d73a0cc0e4bda3bf05abf4b518f94f7eadee3b5d18eccad0b9033c86816467f5"} Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.534894 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v689b" event={"ID":"b0625578-3b48-44c7-9082-174fce3a7e74","Type":"ContainerStarted","Data":"d2de8b3a1c27ebd01b5c3393c6dcb85d202fe549eef0c41d0f9f318c3b15d219"} Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.536763 4772 generic.go:334] "Generic (PLEG): container finished" podID="e329efba-60e3-49c7-81ff-b073be77e34b" 
containerID="10196ecd014a671d2bb0c35a007cf89f6cc32f81e1a8290e8f8bb5f8f7575614" exitCode=0 Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.536809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" event={"ID":"e329efba-60e3-49c7-81ff-b073be77e34b","Type":"ContainerDied","Data":"10196ecd014a671d2bb0c35a007cf89f6cc32f81e1a8290e8f8bb5f8f7575614"} Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.583440 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rl8kf" podStartSLOduration=3.583396222 podStartE2EDuration="3.583396222s" podCreationTimestamp="2026-01-27 15:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:50.540015872 +0000 UTC m=+1196.520624970" watchObservedRunningTime="2026-01-27 15:26:50.583396222 +0000 UTC m=+1196.564005330" Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.598509 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-v689b" podStartSLOduration=3.598487167 podStartE2EDuration="3.598487167s" podCreationTimestamp="2026-01-27 15:26:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:50.554085038 +0000 UTC m=+1196.534694136" watchObservedRunningTime="2026-01-27 15:26:50.598487167 +0000 UTC m=+1196.579096265" Jan 27 15:26:50 crc kubenswrapper[4772]: I0127 15:26:50.955266 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.133614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.133696 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.133754 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx2xr\" (UniqueName: \"kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.133986 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.134017 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.134042 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0\") pod \"34ee743a-1628-42e4-a465-0e3957cae089\" (UID: \"34ee743a-1628-42e4-a465-0e3957cae089\") " Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.153357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr" (OuterVolumeSpecName: "kube-api-access-tx2xr") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "kube-api-access-tx2xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.175290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.182600 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config" (OuterVolumeSpecName: "config") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.186546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.191810 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.236436 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.236467 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.236477 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.236485 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.236493 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx2xr\" (UniqueName: \"kubernetes.io/projected/34ee743a-1628-42e4-a465-0e3957cae089-kube-api-access-tx2xr\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.270155 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34ee743a-1628-42e4-a465-0e3957cae089" (UID: "34ee743a-1628-42e4-a465-0e3957cae089"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.338022 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34ee743a-1628-42e4-a465-0e3957cae089-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.568529 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" event={"ID":"34ee743a-1628-42e4-a465-0e3957cae089","Type":"ContainerDied","Data":"e4869b167a5dcad1e31d6b3e55e70bfbc70108cabaa63e1ee42f306d56b633d7"} Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.568541 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-x8hbr" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.568578 4772 scope.go:117] "RemoveContainer" containerID="d73a0cc0e4bda3bf05abf4b518f94f7eadee3b5d18eccad0b9033c86816467f5" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.572018 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" event={"ID":"e329efba-60e3-49c7-81ff-b073be77e34b","Type":"ContainerStarted","Data":"0bc2eb78a83f1e9ddf6e0c975669640d497ff3e50951b0eaadaee82dc03caffd"} Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.572224 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.599069 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" podStartSLOduration=3.599051722 podStartE2EDuration="3.599051722s" podCreationTimestamp="2026-01-27 15:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:26:51.592971947 +0000 UTC m=+1197.573581045" watchObservedRunningTime="2026-01-27 15:26:51.599051722 +0000 UTC m=+1197.579660820" Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.646414 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:51 crc kubenswrapper[4772]: I0127 15:26:51.659768 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-x8hbr"] Jan 27 15:26:52 crc kubenswrapper[4772]: I0127 15:26:52.680528 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34ee743a-1628-42e4-a465-0e3957cae089" path="/var/lib/kubelet/pods/34ee743a-1628-42e4-a465-0e3957cae089/volumes" Jan 27 15:26:53 crc kubenswrapper[4772]: I0127 15:26:53.593806 4772 generic.go:334] "Generic (PLEG): container 
finished" podID="86d0241f-ae16-400f-837c-3b43c904c91e" containerID="317ff691da5e191e31778e1d02f29484703e057687e372739fcbc9dd6f8088d2" exitCode=0 Jan 27 15:26:53 crc kubenswrapper[4772]: I0127 15:26:53.593866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vdmv7" event={"ID":"86d0241f-ae16-400f-837c-3b43c904c91e","Type":"ContainerDied","Data":"317ff691da5e191e31778e1d02f29484703e057687e372739fcbc9dd6f8088d2"} Jan 27 15:26:54 crc kubenswrapper[4772]: I0127 15:26:54.604809 4772 generic.go:334] "Generic (PLEG): container finished" podID="0851ad59-841c-4133-a043-13d2cfdb0803" containerID="c63a10e019701dbe41c4487398c76cb4acdd6a0eda99f6edb9df7d6273b71a27" exitCode=0 Jan 27 15:26:54 crc kubenswrapper[4772]: I0127 15:26:54.605280 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rl8kf" event={"ID":"0851ad59-841c-4133-a043-13d2cfdb0803","Type":"ContainerDied","Data":"c63a10e019701dbe41c4487398c76cb4acdd6a0eda99f6edb9df7d6273b71a27"} Jan 27 15:26:58 crc kubenswrapper[4772]: I0127 15:26:58.575351 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:26:58 crc kubenswrapper[4772]: I0127 15:26:58.677717 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] Jan 27 15:26:58 crc kubenswrapper[4772]: I0127 15:26:58.678160 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" containerID="cri-o://ded8f7e741d736bdfe8cef79d54407ecbfa8926bb6d56e27836f39ea6ec4c8ef" gracePeriod=10 Jan 27 15:26:59 crc kubenswrapper[4772]: I0127 15:26:59.325550 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 
10.217.0.120:5353: connect: connection refused" Jan 27 15:26:59 crc kubenswrapper[4772]: I0127 15:26:59.678062 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerID="ded8f7e741d736bdfe8cef79d54407ecbfa8926bb6d56e27836f39ea6ec4c8ef" exitCode=0 Jan 27 15:26:59 crc kubenswrapper[4772]: I0127 15:26:59.678119 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" event={"ID":"b6aa637d-4418-4fa4-8a26-249446d2fb3f","Type":"ContainerDied","Data":"ded8f7e741d736bdfe8cef79d54407ecbfa8926bb6d56e27836f39ea6ec4c8ef"} Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.631856 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-vdmv7" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.698999 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-vdmv7" event={"ID":"86d0241f-ae16-400f-837c-3b43c904c91e","Type":"ContainerDied","Data":"a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec"} Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.699038 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28e441b5d88b4432b8107753f5714c4987db6ac6635bc08e9f396a0b42288ec" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.699118 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-vdmv7" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.762192 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle\") pod \"86d0241f-ae16-400f-837c-3b43c904c91e\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.762392 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data\") pod \"86d0241f-ae16-400f-837c-3b43c904c91e\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.762454 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkbkr\" (UniqueName: \"kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr\") pod \"86d0241f-ae16-400f-837c-3b43c904c91e\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.762687 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data\") pod \"86d0241f-ae16-400f-837c-3b43c904c91e\" (UID: \"86d0241f-ae16-400f-837c-3b43c904c91e\") " Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.769292 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr" (OuterVolumeSpecName: "kube-api-access-kkbkr") pod "86d0241f-ae16-400f-837c-3b43c904c91e" (UID: "86d0241f-ae16-400f-837c-3b43c904c91e"). InnerVolumeSpecName "kube-api-access-kkbkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.770051 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "86d0241f-ae16-400f-837c-3b43c904c91e" (UID: "86d0241f-ae16-400f-837c-3b43c904c91e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.794875 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86d0241f-ae16-400f-837c-3b43c904c91e" (UID: "86d0241f-ae16-400f-837c-3b43c904c91e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.816271 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data" (OuterVolumeSpecName: "config-data") pod "86d0241f-ae16-400f-837c-3b43c904c91e" (UID: "86d0241f-ae16-400f-837c-3b43c904c91e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.868407 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.870806 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.870936 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86d0241f-ae16-400f-837c-3b43c904c91e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:01 crc kubenswrapper[4772]: I0127 15:27:01.870998 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkbkr\" (UniqueName: \"kubernetes.io/projected/86d0241f-ae16-400f-837c-3b43c904c91e-kube-api-access-kkbkr\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.149976 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:27:03 crc kubenswrapper[4772]: E0127 15:27:03.150398 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ee743a-1628-42e4-a465-0e3957cae089" containerName="init" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.150413 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ee743a-1628-42e4-a465-0e3957cae089" containerName="init" Jan 27 15:27:03 crc kubenswrapper[4772]: E0127 15:27:03.150440 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d0241f-ae16-400f-837c-3b43c904c91e" containerName="glance-db-sync" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.150448 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="86d0241f-ae16-400f-837c-3b43c904c91e" containerName="glance-db-sync" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.150630 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d0241f-ae16-400f-837c-3b43c904c91e" containerName="glance-db-sync" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.150649 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ee743a-1628-42e4-a465-0e3957cae089" containerName="init" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.153683 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.190280 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211135 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211312 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhkt\" (UniqueName: \"kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211368 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " 
pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.211462 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313669 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313783 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhkt\" (UniqueName: \"kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " 
pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313813 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313854 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.313874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.314633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 
15:27:03.314658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.314901 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.314911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.315214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.335719 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhkt\" (UniqueName: \"kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt\") pod \"dnsmasq-dns-8b5c85b87-dqgvx\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.476919 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.987208 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.990363 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.992882 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.992956 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.994071 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vd4fn" Jan 27 15:27:03 crc kubenswrapper[4772]: I0127 15:27:03.998542 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wphqd\" (UniqueName: \"kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025186 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025227 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025252 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025337 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.025388 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126513 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126641 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126700 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wphqd\" (UniqueName: \"kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.126791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.127263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.127320 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.127683 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.133230 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.141810 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.143033 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.146015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wphqd\" (UniqueName: \"kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.168413 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.315519 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.401453 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.403935 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.407004 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.415362 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432780 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 
15:27:04.432805 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.432919 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rkbs\" (UniqueName: \"kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.534283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: 
I0127 15:27:04.534338 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.534372 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.534423 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.534513 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.534523 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.535026 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.535047 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rkbs\" (UniqueName: \"kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.535442 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.535464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.541716 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.544571 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.544571 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.550827 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rkbs\" (UniqueName: \"kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.558863 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:04 crc kubenswrapper[4772]: I0127 15:27:04.732146 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:05 crc kubenswrapper[4772]: I0127 15:27:05.557405 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:05 crc kubenswrapper[4772]: I0127 15:27:05.637395 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:09 crc kubenswrapper[4772]: I0127 15:27:09.325746 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.120:5353: i/o timeout" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.192996 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.200024 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298021 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298081 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298242 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298283 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtsjh\" (UniqueName: \"kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298335 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwktg\" (UniqueName: \"kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298363 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298403 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298438 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298464 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298527 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.298575 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config\") pod \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\" (UID: \"b6aa637d-4418-4fa4-8a26-249446d2fb3f\") " Jan 27 15:27:13 crc 
kubenswrapper[4772]: I0127 15:27:13.298600 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts\") pod \"0851ad59-841c-4133-a043-13d2cfdb0803\" (UID: \"0851ad59-841c-4133-a043-13d2cfdb0803\") " Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.304618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh" (OuterVolumeSpecName: "kube-api-access-gtsjh") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "kube-api-access-gtsjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.305912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.306794 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts" (OuterVolumeSpecName: "scripts") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.307925 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg" (OuterVolumeSpecName: "kube-api-access-fwktg") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). 
InnerVolumeSpecName "kube-api-access-fwktg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.309450 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.336610 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.340435 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data" (OuterVolumeSpecName: "config-data") pod "0851ad59-841c-4133-a043-13d2cfdb0803" (UID: "0851ad59-841c-4133-a043-13d2cfdb0803"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.350444 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.353026 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config" (OuterVolumeSpecName: "config") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.355265 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.357334 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.361342 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6aa637d-4418-4fa4-8a26-249446d2fb3f" (UID: "b6aa637d-4418-4fa4-8a26-249446d2fb3f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400582 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400612 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwktg\" (UniqueName: \"kubernetes.io/projected/0851ad59-841c-4133-a043-13d2cfdb0803-kube-api-access-fwktg\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400662 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400675 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400686 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400699 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400709 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6aa637d-4418-4fa4-8a26-249446d2fb3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400719 4772 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400727 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400735 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400745 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0851ad59-841c-4133-a043-13d2cfdb0803-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.400754 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtsjh\" (UniqueName: \"kubernetes.io/projected/b6aa637d-4418-4fa4-8a26-249446d2fb3f-kube-api-access-gtsjh\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:13 crc kubenswrapper[4772]: E0127 15:27:13.670892 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 27 15:27:13 crc kubenswrapper[4772]: E0127 15:27:13.671110 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cvd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-pmk27_openstack(5a423229-06be-4934-9715-58105e1af686): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:27:13 crc kubenswrapper[4772]: E0127 15:27:13.672281 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-pmk27" 
podUID="5a423229-06be-4934-9715-58105e1af686" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.815019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rl8kf" event={"ID":"0851ad59-841c-4133-a043-13d2cfdb0803","Type":"ContainerDied","Data":"c20becc5003b571ca45e8d820a72a46ddfed0eee505f84347fd06aa34646e7c4"} Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.815097 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rl8kf" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.815154 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c20becc5003b571ca45e8d820a72a46ddfed0eee505f84347fd06aa34646e7c4" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.818335 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.818344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" event={"ID":"b6aa637d-4418-4fa4-8a26-249446d2fb3f","Type":"ContainerDied","Data":"353300bf1914ec8c1fafaa4dfe7633842f95697653e6f9ec7954d70422c9cfbd"} Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.818442 4772 scope.go:117] "RemoveContainer" containerID="ded8f7e741d736bdfe8cef79d54407ecbfa8926bb6d56e27836f39ea6ec4c8ef" Jan 27 15:27:13 crc kubenswrapper[4772]: E0127 15:27:13.819580 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-pmk27" podUID="5a423229-06be-4934-9715-58105e1af686" Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.876817 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] 
Jan 27 15:27:13 crc kubenswrapper[4772]: I0127 15:27:13.884542 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-lll86"] Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.281613 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rl8kf"] Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.291277 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rl8kf"] Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.326812 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-lll86" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.120:5353: i/o timeout" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388145 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fl4nt"] Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.388567 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388588 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.388612 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0851ad59-841c-4133-a043-13d2cfdb0803" containerName="keystone-bootstrap" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388621 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0851ad59-841c-4133-a043-13d2cfdb0803" containerName="keystone-bootstrap" Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.388634 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="init" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388644 
4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="init" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388889 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" containerName="dnsmasq-dns" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.388911 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0851ad59-841c-4133-a043-13d2cfdb0803" containerName="keystone-bootstrap" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.389573 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.392077 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdjsw" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.392313 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.392493 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.392898 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.393078 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.398252 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fl4nt"] Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.534871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data\") pod \"keystone-bootstrap-fl4nt\" (UID: 
\"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.535329 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968vm\" (UniqueName: \"kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.535408 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.535438 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.535464 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.535517 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: 
\"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.637706 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.637816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968vm\" (UniqueName: \"kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.637900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.637929 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.637954 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc 
kubenswrapper[4772]: I0127 15:27:14.638014 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.641946 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.642141 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.642835 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.643113 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.652308 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.655857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968vm\" (UniqueName: \"kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm\") pod \"keystone-bootstrap-fl4nt\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.673596 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0851ad59-841c-4133-a043-13d2cfdb0803" path="/var/lib/kubelet/pods/0851ad59-841c-4133-a043-13d2cfdb0803/volumes" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.674339 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6aa637d-4418-4fa4-8a26-249446d2fb3f" path="/var/lib/kubelet/pods/b6aa637d-4418-4fa4-8a26-249446d2fb3f/volumes" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.749079 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:14 crc kubenswrapper[4772]: I0127 15:27:14.883051 4772 scope.go:117] "RemoveContainer" containerID="26a0610819b472b19e1babe3f9b5893ac7bd92b0c9047d536f0dadb42db99a12" Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.937626 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.937873 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4tmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-8l85z_openstack(9ae05919-68bf-43d1-abd9-9908ec287bd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 15:27:14 crc kubenswrapper[4772]: E0127 15:27:14.943014 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-8l85z" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.367950 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:27:15 crc kubenswrapper[4772]: W0127 15:27:15.374800 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17a547a9_a098_43b7_a153_ad9a137369de.slice/crio-31e83f6ba26ca249b5435d61c8786bdc24b0777adb10cbb234cdaacbda3e0db7 WatchSource:0}: Error finding container 31e83f6ba26ca249b5435d61c8786bdc24b0777adb10cbb234cdaacbda3e0db7: Status 404 returned error can't find the container with id 31e83f6ba26ca249b5435d61c8786bdc24b0777adb10cbb234cdaacbda3e0db7 Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.501618 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fl4nt"] Jan 27 15:27:15 crc kubenswrapper[4772]: W0127 15:27:15.505752 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8322baad_60c1_4d0b_96e3_51038f2e447a.slice/crio-00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d WatchSource:0}: Error finding container 00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d: Status 404 returned error can't find the container with id 00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.851966 4772 generic.go:334] "Generic (PLEG): container finished" podID="17a547a9-a098-43b7-a153-ad9a137369de" containerID="dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023" exitCode=0 Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.852566 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" event={"ID":"17a547a9-a098-43b7-a153-ad9a137369de","Type":"ContainerDied","Data":"dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023"} Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.852607 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" event={"ID":"17a547a9-a098-43b7-a153-ad9a137369de","Type":"ContainerStarted","Data":"31e83f6ba26ca249b5435d61c8786bdc24b0777adb10cbb234cdaacbda3e0db7"} Jan 
27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.858714 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.861147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerStarted","Data":"f76b5eae8b9d1fd746edffe9a9f5a02ca0ad4ea09665e63c5dbeacff4753fa40"} Jan 27 15:27:15 crc kubenswrapper[4772]: W0127 15:27:15.869505 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e34d70_0be1_400d_b214_62ba7d9e2e09.slice/crio-38fe91c20bb8dce720ba73a97f9c737c6e74e3b73cf336c9f1c1013c6d14e07f WatchSource:0}: Error finding container 38fe91c20bb8dce720ba73a97f9c737c6e74e3b73cf336c9f1c1013c6d14e07f: Status 404 returned error can't find the container with id 38fe91c20bb8dce720ba73a97f9c737c6e74e3b73cf336c9f1c1013c6d14e07f Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.872768 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zf2tx" event={"ID":"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6","Type":"ContainerStarted","Data":"d2b29cba9bcd684a9fa3005c73cbd809102e0bb6c21ef6ed5d53662bb4cdcdaa"} Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.887361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl4nt" event={"ID":"8322baad-60c1-4d0b-96e3-51038f2e447a","Type":"ContainerStarted","Data":"5de6bd74908b324e47419d9f37b784b689e01e1c833ca0e1c7d7483a1e19037c"} Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.887400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl4nt" event={"ID":"8322baad-60c1-4d0b-96e3-51038f2e447a","Type":"ContainerStarted","Data":"00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d"} Jan 27 15:27:15 crc kubenswrapper[4772]: E0127 15:27:15.891542 
4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-8l85z" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.908000 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zf2tx" podStartSLOduration=2.441509563 podStartE2EDuration="27.907978432s" podCreationTimestamp="2026-01-27 15:26:48 +0000 UTC" firstStartedPulling="2026-01-27 15:26:49.382637438 +0000 UTC m=+1195.363246536" lastFinishedPulling="2026-01-27 15:27:14.849106307 +0000 UTC m=+1220.829715405" observedRunningTime="2026-01-27 15:27:15.904495631 +0000 UTC m=+1221.885104729" watchObservedRunningTime="2026-01-27 15:27:15.907978432 +0000 UTC m=+1221.888587530" Jan 27 15:27:15 crc kubenswrapper[4772]: I0127 15:27:15.951848 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fl4nt" podStartSLOduration=1.9518243050000001 podStartE2EDuration="1.951824305s" podCreationTimestamp="2026-01-27 15:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:15.945598765 +0000 UTC m=+1221.926207873" watchObservedRunningTime="2026-01-27 15:27:15.951824305 +0000 UTC m=+1221.932433403" Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.752189 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:16 crc kubenswrapper[4772]: W0127 15:27:16.765645 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc3206f0_7c01_44c9_9d6a_c586a9b25db8.slice/crio-9244a9db558588c4d27b4c9d5dab7473287ca2114db14b6dfae8075b2b13cc13 
WatchSource:0}: Error finding container 9244a9db558588c4d27b4c9d5dab7473287ca2114db14b6dfae8075b2b13cc13: Status 404 returned error can't find the container with id 9244a9db558588c4d27b4c9d5dab7473287ca2114db14b6dfae8075b2b13cc13 Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.928751 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerStarted","Data":"ed51d0aa4ae1c7166bbf0464f2b405f79a0faa50f99c4244c9717d1a1fd81db2"} Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.931423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" event={"ID":"17a547a9-a098-43b7-a153-ad9a137369de","Type":"ContainerStarted","Data":"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac"} Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.931614 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.933911 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerStarted","Data":"9244a9db558588c4d27b4c9d5dab7473287ca2114db14b6dfae8075b2b13cc13"} Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.944717 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerStarted","Data":"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64"} Jan 27 15:27:16 crc kubenswrapper[4772]: I0127 15:27:16.944799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerStarted","Data":"38fe91c20bb8dce720ba73a97f9c737c6e74e3b73cf336c9f1c1013c6d14e07f"} Jan 27 15:27:16 crc 
kubenswrapper[4772]: I0127 15:27:16.951150 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" podStartSLOduration=13.951131055 podStartE2EDuration="13.951131055s" podCreationTimestamp="2026-01-27 15:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:16.950492416 +0000 UTC m=+1222.931101524" watchObservedRunningTime="2026-01-27 15:27:16.951131055 +0000 UTC m=+1222.931740153" Jan 27 15:27:17 crc kubenswrapper[4772]: I0127 15:27:17.963692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerStarted","Data":"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e"} Jan 27 15:27:17 crc kubenswrapper[4772]: I0127 15:27:17.967049 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-log" containerID="cri-o://d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64" gracePeriod=30 Jan 27 15:27:17 crc kubenswrapper[4772]: I0127 15:27:17.967209 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerStarted","Data":"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63"} Jan 27 15:27:17 crc kubenswrapper[4772]: I0127 15:27:17.967387 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-httpd" containerID="cri-o://612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63" gracePeriod=30 Jan 27 15:27:17 crc kubenswrapper[4772]: I0127 15:27:17.999908 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.999882739 podStartE2EDuration="15.999882739s" podCreationTimestamp="2026-01-27 15:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:17.993227517 +0000 UTC m=+1223.973836615" watchObservedRunningTime="2026-01-27 15:27:17.999882739 +0000 UTC m=+1223.980491837" Jan 27 15:27:18 crc kubenswrapper[4772]: I0127 15:27:18.982822 4772 generic.go:334] "Generic (PLEG): container finished" podID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerID="d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64" exitCode=143 Jan 27 15:27:18 crc kubenswrapper[4772]: I0127 15:27:18.983005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerDied","Data":"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64"} Jan 27 15:27:19 crc kubenswrapper[4772]: I0127 15:27:19.984859 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.004979 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerStarted","Data":"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144"} Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.005162 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-log" containerID="cri-o://dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" gracePeriod=30 Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.005447 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-httpd" containerID="cri-o://e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" gracePeriod=30 Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.018740 4772 generic.go:334] "Generic (PLEG): container finished" podID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerID="612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63" exitCode=0 Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.018789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerDied","Data":"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63"} Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.018820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"42e34d70-0be1-400d-b214-62ba7d9e2e09","Type":"ContainerDied","Data":"38fe91c20bb8dce720ba73a97f9c737c6e74e3b73cf336c9f1c1013c6d14e07f"} Jan 27 
15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.018839 4772 scope.go:117] "RemoveContainer" containerID="612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.019016 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.043531 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=17.043510515 podStartE2EDuration="17.043510515s" podCreationTimestamp="2026-01-27 15:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:20.038255613 +0000 UTC m=+1226.018864731" watchObservedRunningTime="2026-01-27 15:27:20.043510515 +0000 UTC m=+1226.024119613" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.065414 4772 scope.go:117] "RemoveContainer" containerID="d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081098 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wphqd\" (UniqueName: \"kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081236 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081468 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081500 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.081548 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run\") pod \"42e34d70-0be1-400d-b214-62ba7d9e2e09\" (UID: \"42e34d70-0be1-400d-b214-62ba7d9e2e09\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.082090 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.082494 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs" (OuterVolumeSpecName: "logs") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.087867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.088044 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts" (OuterVolumeSpecName: "scripts") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.088119 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd" (OuterVolumeSpecName: "kube-api-access-wphqd") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "kube-api-access-wphqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.107140 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.127647 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data" (OuterVolumeSpecName: "config-data") pod "42e34d70-0be1-400d-b214-62ba7d9e2e09" (UID: "42e34d70-0be1-400d-b214-62ba7d9e2e09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183735 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183765 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183773 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183781 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 
15:27:20.183790 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/42e34d70-0be1-400d-b214-62ba7d9e2e09-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183800 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wphqd\" (UniqueName: \"kubernetes.io/projected/42e34d70-0be1-400d-b214-62ba7d9e2e09-kube-api-access-wphqd\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.183810 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42e34d70-0be1-400d-b214-62ba7d9e2e09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.202061 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.226326 4772 scope.go:117] "RemoveContainer" containerID="612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63" Jan 27 15:27:20 crc kubenswrapper[4772]: E0127 15:27:20.226821 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63\": container with ID starting with 612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63 not found: ID does not exist" containerID="612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.226897 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63"} err="failed to get container status \"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63\": rpc error: code = 
NotFound desc = could not find container \"612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63\": container with ID starting with 612151cd11eb1c096a4f60cba2e500a7912da9863b27317fc35d7db2a913ec63 not found: ID does not exist" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.227235 4772 scope.go:117] "RemoveContainer" containerID="d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64" Jan 27 15:27:20 crc kubenswrapper[4772]: E0127 15:27:20.228038 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64\": container with ID starting with d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64 not found: ID does not exist" containerID="d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.228084 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64"} err="failed to get container status \"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64\": rpc error: code = NotFound desc = could not find container \"d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64\": container with ID starting with d9b6567e565b2918f93dacf3cb905248aa46c864651dde1d9ea774cbf20bdb64 not found: ID does not exist" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.285122 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.366703 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.394785 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.407656 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:20 crc kubenswrapper[4772]: E0127 15:27:20.408376 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-httpd" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.408397 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-httpd" Jan 27 15:27:20 crc kubenswrapper[4772]: E0127 15:27:20.408411 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-log" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.408418 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-log" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.408636 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-httpd" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.408670 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" containerName="glance-log" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.411701 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.418429 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.420075 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.420265 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499070 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q49rw\" (UniqueName: \"kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499270 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499298 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499432 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.499638 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.569639 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612062 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612262 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612292 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 
15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612434 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612660 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612737 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q49rw\" (UniqueName: \"kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.612785 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.613442 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.613668 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.618295 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.621263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.622897 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.623664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.635109 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q49rw\" (UniqueName: 
\"kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.650612 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.673075 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e34d70-0be1-400d-b214-62ba7d9e2e09" path="/var/lib/kubelet/pods/42e34d70-0be1-400d-b214-62ba7d9e2e09/volumes" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.713989 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rkbs\" (UniqueName: \"kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714091 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714563 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714607 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714674 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.714722 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle\") pod \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\" (UID: \"bc3206f0-7c01-44c9-9d6a-c586a9b25db8\") " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.715396 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.715526 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs" (OuterVolumeSpecName: "logs") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.719082 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs" (OuterVolumeSpecName: "kube-api-access-6rkbs") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "kube-api-access-6rkbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.719410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts" (OuterVolumeSpecName: "scripts") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.719509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.737705 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.762856 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data" (OuterVolumeSpecName: "config-data") pod "bc3206f0-7c01-44c9-9d6a-c586a9b25db8" (UID: "bc3206f0-7c01-44c9-9d6a-c586a9b25db8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.765967 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816731 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816767 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816802 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816812 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816820 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816831 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rkbs\" (UniqueName: \"kubernetes.io/projected/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-kube-api-access-6rkbs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.816839 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc3206f0-7c01-44c9-9d6a-c586a9b25db8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.833694 4772 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 27 15:27:20 crc kubenswrapper[4772]: I0127 15:27:20.918628 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031353 4772 generic.go:334] "Generic (PLEG): container finished" podID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerID="e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" exitCode=143 Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031388 4772 generic.go:334] "Generic (PLEG): container finished" podID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerID="dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" exitCode=143 Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerDied","Data":"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144"} Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031480 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerDied","Data":"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e"} Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031487 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031509 4772 scope.go:117] "RemoveContainer" containerID="e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.031497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"bc3206f0-7c01-44c9-9d6a-c586a9b25db8","Type":"ContainerDied","Data":"9244a9db558588c4d27b4c9d5dab7473287ca2114db14b6dfae8075b2b13cc13"} Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.041147 4772 generic.go:334] "Generic (PLEG): container finished" podID="8322baad-60c1-4d0b-96e3-51038f2e447a" containerID="5de6bd74908b324e47419d9f37b784b689e01e1c833ca0e1c7d7483a1e19037c" exitCode=0 Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.041234 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl4nt" event={"ID":"8322baad-60c1-4d0b-96e3-51038f2e447a","Type":"ContainerDied","Data":"5de6bd74908b324e47419d9f37b784b689e01e1c833ca0e1c7d7483a1e19037c"} Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.080206 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.082973 4772 scope.go:117] "RemoveContainer" containerID="dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.090542 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.107740 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:21 crc kubenswrapper[4772]: E0127 15:27:21.108358 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-log" 
Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.108373 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-log" Jan 27 15:27:21 crc kubenswrapper[4772]: E0127 15:27:21.108400 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-httpd" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.108406 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-httpd" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.108553 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-httpd" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.108568 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" containerName="glance-log" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.108961 4772 scope.go:117] "RemoveContainer" containerID="e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" Jan 27 15:27:21 crc kubenswrapper[4772]: E0127 15:27:21.110548 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144\": container with ID starting with e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144 not found: ID does not exist" containerID="e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.110591 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144"} err="failed to get container status \"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144\": rpc error: code = NotFound desc = could 
not find container \"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144\": container with ID starting with e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144 not found: ID does not exist" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.110618 4772 scope.go:117] "RemoveContainer" containerID="dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" Jan 27 15:27:21 crc kubenswrapper[4772]: E0127 15:27:21.111698 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e\": container with ID starting with dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e not found: ID does not exist" containerID="dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.111723 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e"} err="failed to get container status \"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e\": rpc error: code = NotFound desc = could not find container \"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e\": container with ID starting with dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e not found: ID does not exist" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.111735 4772 scope.go:117] "RemoveContainer" containerID="e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.112507 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144"} err="failed to get container status \"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144\": rpc error: code = NotFound 
desc = could not find container \"e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144\": container with ID starting with e5923297bfaf90562635dde0a7a065cb6e5b62f035580ac799c013859809f144 not found: ID does not exist" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.112532 4772 scope.go:117] "RemoveContainer" containerID="dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.112768 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e"} err="failed to get container status \"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e\": rpc error: code = NotFound desc = could not find container \"dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e\": container with ID starting with dfaebe2fb36461e7df7eef945e33c09ed5b7d7616129107b7ad1d7131c28fb7e not found: ID does not exist" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.114675 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.121340 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.121444 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.134260 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.142396 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.224990 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9t4\" (UniqueName: \"kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225203 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc 
kubenswrapper[4772]: I0127 15:27:21.225251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225274 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225429 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225527 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.225610 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 
15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9t4\" (UniqueName: \"kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327366 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327394 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327429 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.327462 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.328008 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.328067 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.328966 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.334529 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.334529 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.334878 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.335248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.347064 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9t4\" (UniqueName: \"kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4\") pod 
\"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.353985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:27:21 crc kubenswrapper[4772]: I0127 15:27:21.485614 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.056231 4772 generic.go:334] "Generic (PLEG): container finished" podID="f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" containerID="d2b29cba9bcd684a9fa3005c73cbd809102e0bb6c21ef6ed5d53662bb4cdcdaa" exitCode=0 Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.056301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zf2tx" event={"ID":"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6","Type":"ContainerDied","Data":"d2b29cba9bcd684a9fa3005c73cbd809102e0bb6c21ef6ed5d53662bb4cdcdaa"} Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.065038 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerStarted","Data":"c52299828ac41e83b1686de53ba3808d1e810b20370ec9d5bc6e9bbc6b64bbed"} Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.065098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerStarted","Data":"65117f0b87347a480b318a709dc150116a10a8d323bd2553e117803b3054a685"} Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.696431 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.700677 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3206f0-7c01-44c9-9d6a-c586a9b25db8" path="/var/lib/kubelet/pods/bc3206f0-7c01-44c9-9d6a-c586a9b25db8/volumes" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.785805 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys\") pod \"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.785901 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts\") pod \"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.786067 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data\") pod \"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.786125 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968vm\" (UniqueName: \"kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm\") pod \"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.786210 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys\") pod 
\"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.786250 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle\") pod \"8322baad-60c1-4d0b-96e3-51038f2e447a\" (UID: \"8322baad-60c1-4d0b-96e3-51038f2e447a\") " Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.791207 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm" (OuterVolumeSpecName: "kube-api-access-968vm") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "kube-api-access-968vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.798717 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.804921 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts" (OuterVolumeSpecName: "scripts") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.810788 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.819187 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data" (OuterVolumeSpecName: "config-data") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.823311 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8322baad-60c1-4d0b-96e3-51038f2e447a" (UID: "8322baad-60c1-4d0b-96e3-51038f2e447a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887680 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887716 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887729 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887738 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887746 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8322baad-60c1-4d0b-96e3-51038f2e447a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:22 crc kubenswrapper[4772]: I0127 15:27:22.887754 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968vm\" (UniqueName: \"kubernetes.io/projected/8322baad-60c1-4d0b-96e3-51038f2e447a-kube-api-access-968vm\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.089270 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl4nt" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.090436 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl4nt" event={"ID":"8322baad-60c1-4d0b-96e3-51038f2e447a","Type":"ContainerDied","Data":"00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d"} Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.090484 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00f64e139d84dd0a1b89ff44770726f3c1e45680dfa15727e0c2287a16f89b5d" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.183899 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:27:23 crc kubenswrapper[4772]: E0127 15:27:23.184350 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8322baad-60c1-4d0b-96e3-51038f2e447a" containerName="keystone-bootstrap" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.184363 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8322baad-60c1-4d0b-96e3-51038f2e447a" containerName="keystone-bootstrap" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.184508 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8322baad-60c1-4d0b-96e3-51038f2e447a" containerName="keystone-bootstrap" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.185074 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.189302 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.189397 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.189842 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.193398 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.193616 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bdjsw" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.193772 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.198911 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.297909 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298054 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " 
pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298162 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298465 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298638 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksdpm\" (UniqueName: \"kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298720 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: 
\"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.298775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400695 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400780 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksdpm\" (UniqueName: \"kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " 
pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400841 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400884 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400936 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.400976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.405320 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 
15:27:23.405573 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.407090 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.408286 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.410016 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.410741 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.418292 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.425446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksdpm\" (UniqueName: \"kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm\") pod \"keystone-677fb7d6fc-djjsx\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.480502 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.550253 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.572993 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:27:23 crc kubenswrapper[4772]: I0127 15:27:23.575391 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="dnsmasq-dns" containerID="cri-o://0bc2eb78a83f1e9ddf6e0c975669640d497ff3e50951b0eaadaee82dc03caffd" gracePeriod=10 Jan 27 15:27:24 crc kubenswrapper[4772]: I0127 15:27:24.104355 4772 generic.go:334] "Generic (PLEG): container finished" podID="e329efba-60e3-49c7-81ff-b073be77e34b" containerID="0bc2eb78a83f1e9ddf6e0c975669640d497ff3e50951b0eaadaee82dc03caffd" exitCode=0 Jan 27 15:27:24 crc kubenswrapper[4772]: I0127 15:27:24.104531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" 
event={"ID":"e329efba-60e3-49c7-81ff-b073be77e34b","Type":"ContainerDied","Data":"0bc2eb78a83f1e9ddf6e0c975669640d497ff3e50951b0eaadaee82dc03caffd"} Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.084659 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zf2tx" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.154395 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zf2tx" event={"ID":"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6","Type":"ContainerDied","Data":"7dfb28db1e03cbbc36a413590b93de83567a6c9fa02be76267be1180098e9795"} Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.154461 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dfb28db1e03cbbc36a413590b93de83567a6c9fa02be76267be1180098e9795" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.154555 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zf2tx" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.184871 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts\") pod \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.184915 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle\") pod \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.184954 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs\") pod 
\"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.185000 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data\") pod \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.185097 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8x5b\" (UniqueName: \"kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b\") pod \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\" (UID: \"f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.186214 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs" (OuterVolumeSpecName: "logs") pod "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" (UID: "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.205444 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts" (OuterVolumeSpecName: "scripts") pod "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" (UID: "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.205497 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b" (OuterVolumeSpecName: "kube-api-access-d8x5b") pod "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" (UID: "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6"). 
InnerVolumeSpecName "kube-api-access-d8x5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.226573 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data" (OuterVolumeSpecName: "config-data") pod "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" (UID: "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.233121 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" (UID: "f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.287633 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.287664 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.287676 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.287686 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-config-data\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.287694 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8x5b\" (UniqueName: \"kubernetes.io/projected/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6-kube-api-access-d8x5b\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.326396 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.490073 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.490283 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.490671 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.490713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlv8\" (UniqueName: \"kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.490741 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.491063 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0\") pod \"e329efba-60e3-49c7-81ff-b073be77e34b\" (UID: \"e329efba-60e3-49c7-81ff-b073be77e34b\") " Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.512197 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8" (OuterVolumeSpecName: "kube-api-access-zmlv8") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). InnerVolumeSpecName "kube-api-access-zmlv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.527867 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:27:27 crc kubenswrapper[4772]: W0127 15:27:27.533725 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e790127_8223_4b0c_8a5d_21e1bb15fa30.slice/crio-8b30129bf5b3504ae600edeaafe66f62d6f0c11b788461d423310f03199da3c5 WatchSource:0}: Error finding container 8b30129bf5b3504ae600edeaafe66f62d6f0c11b788461d423310f03199da3c5: Status 404 returned error can't find the container with id 8b30129bf5b3504ae600edeaafe66f62d6f0c11b788461d423310f03199da3c5 Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.566458 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.581258 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.581272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.582113 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.593305 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.593338 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.593348 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.593356 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlv8\" (UniqueName: \"kubernetes.io/projected/e329efba-60e3-49c7-81ff-b073be77e34b-kube-api-access-zmlv8\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.593367 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.599448 4772 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config" (OuterVolumeSpecName: "config") pod "e329efba-60e3-49c7-81ff-b073be77e34b" (UID: "e329efba-60e3-49c7-81ff-b073be77e34b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.601381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:27:27 crc kubenswrapper[4772]: W0127 15:27:27.604214 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f85a83_f245_40ff_b994_50cab01b2530.slice/crio-70507e561102278ed4f801ac168676eb09960026059c30c45c5fe4950449c589 WatchSource:0}: Error finding container 70507e561102278ed4f801ac168676eb09960026059c30c45c5fe4950449c589: Status 404 returned error can't find the container with id 70507e561102278ed4f801ac168676eb09960026059c30c45c5fe4950449c589 Jan 27 15:27:27 crc kubenswrapper[4772]: I0127 15:27:27.694875 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e329efba-60e3-49c7-81ff-b073be77e34b-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.182082 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.182085 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-8d9vp" event={"ID":"e329efba-60e3-49c7-81ff-b073be77e34b","Type":"ContainerDied","Data":"dc924b979634b0cd0c7264ffb70a5a244bf22da4a19a82562f283a10b69f4841"} Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.182317 4772 scope.go:117] "RemoveContainer" containerID="0bc2eb78a83f1e9ddf6e0c975669640d497ff3e50951b0eaadaee82dc03caffd" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.185226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerStarted","Data":"70507e561102278ed4f801ac168676eb09960026059c30c45c5fe4950449c589"} Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.187373 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerStarted","Data":"9775d2c5b4eda3cae695814a686a4a82d4426bf3d7d28a73dffa9b807c4c16b8"} Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.190400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-677fb7d6fc-djjsx" event={"ID":"6e790127-8223-4b0c-8a5d-21e1bb15fa30","Type":"ContainerStarted","Data":"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc"} Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.190469 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-677fb7d6fc-djjsx" event={"ID":"6e790127-8223-4b0c-8a5d-21e1bb15fa30","Type":"ContainerStarted","Data":"8b30129bf5b3504ae600edeaafe66f62d6f0c11b788461d423310f03199da3c5"} Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.190636 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 
15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.232896 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-677fb7d6fc-djjsx" podStartSLOduration=5.232864742 podStartE2EDuration="5.232864742s" podCreationTimestamp="2026-01-27 15:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:28.224391365 +0000 UTC m=+1234.205000483" watchObservedRunningTime="2026-01-27 15:27:28.232864742 +0000 UTC m=+1234.213473840" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.251896 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:27:28 crc kubenswrapper[4772]: E0127 15:27:28.252351 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" containerName="placement-db-sync" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.252376 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" containerName="placement-db-sync" Jan 27 15:27:28 crc kubenswrapper[4772]: E0127 15:27:28.252391 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="init" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.252397 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="init" Jan 27 15:27:28 crc kubenswrapper[4772]: E0127 15:27:28.252419 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="dnsmasq-dns" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.252424 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="dnsmasq-dns" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.252650 4772 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" containerName="placement-db-sync" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.252677 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" containerName="dnsmasq-dns" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.253813 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.255919 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.255950 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.258043 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-4tg2g" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.258352 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.259238 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.259213538000001 podStartE2EDuration="8.259213538s" podCreationTimestamp="2026-01-27 15:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:28.253162022 +0000 UTC m=+1234.233771130" watchObservedRunningTime="2026-01-27 15:27:28.259213538 +0000 UTC m=+1234.239822646" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.260216 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.285924 4772 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.304310 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.314488 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-8d9vp"] Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409106 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409235 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409267 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " 
pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409322 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknks\" (UniqueName: \"kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.409831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514650 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " 
pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514779 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514805 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknks\" (UniqueName: \"kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.514939 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 
15:27:28.517944 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.519001 4772 scope.go:117] "RemoveContainer" containerID="10196ecd014a671d2bb0c35a007cf89f6cc32f81e1a8290e8f8bb5f8f7575614" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.522395 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.522915 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.524310 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.527982 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc 
kubenswrapper[4772]: I0127 15:27:28.528875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.532118 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknks\" (UniqueName: \"kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks\") pod \"placement-597699949b-q6msx\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.593617 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:28 crc kubenswrapper[4772]: I0127 15:27:28.683475 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e329efba-60e3-49c7-81ff-b073be77e34b" path="/var/lib/kubelet/pods/e329efba-60e3-49c7-81ff-b073be77e34b/volumes" Jan 27 15:27:29 crc kubenswrapper[4772]: I0127 15:27:29.201899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerStarted","Data":"58d128e4a7f44cc529be47e9f224989cce3b8a08dc4e4f4d37d49e38c0c7b8d2"} Jan 27 15:27:29 crc kubenswrapper[4772]: I0127 15:27:29.204372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerStarted","Data":"b87da5e7b978350e6830e0f65fce50644eee1e1665a4ebcd45d4d0010f0f31d7"} Jan 27 15:27:29 crc kubenswrapper[4772]: I0127 15:27:29.303090 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:27:29 crc 
kubenswrapper[4772]: W0127 15:27:29.353274 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4205dfea_7dc7_496a_9745_fc5e3d0a418a.slice/crio-77708c49aaa66488bf09da947ac24b469a4cd3c49071689cbd09cfa6aa9b79b5 WatchSource:0}: Error finding container 77708c49aaa66488bf09da947ac24b469a4cd3c49071689cbd09cfa6aa9b79b5: Status 404 returned error can't find the container with id 77708c49aaa66488bf09da947ac24b469a4cd3c49071689cbd09cfa6aa9b79b5 Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.213964 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8l85z" event={"ID":"9ae05919-68bf-43d1-abd9-9908ec287bd0","Type":"ContainerStarted","Data":"f581dd644d182efa5f740dc0b5a2f4adfb865bef3f027972802161889179f1d4"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.227208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerStarted","Data":"ad26ca4835a223df0b0aa3065e02d9e54b67030d2b6d0436f1f1a0dd7bf06415"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.227251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerStarted","Data":"f10ed54f4ea68e56be83b8d8387a9768612b5c035b1fc42928132066af5bd689"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.227264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerStarted","Data":"77708c49aaa66488bf09da947ac24b469a4cd3c49071689cbd09cfa6aa9b79b5"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.227825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 
15:27:30.227854 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-597699949b-q6msx" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.233710 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmk27" event={"ID":"5a423229-06be-4934-9715-58105e1af686","Type":"ContainerStarted","Data":"6ba95c7bf22c812cf8d7d855d86c702f5f7f90db05ec7fc2281ddec549f7d67b"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.236435 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerStarted","Data":"f1accbd1db4a8c2dce7512a2eb2abaa265e29ed373b0fc121d29515c5bba0e55"} Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.251096 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8l85z" podStartSLOduration=2.19845139 podStartE2EDuration="42.251073127s" podCreationTimestamp="2026-01-27 15:26:48 +0000 UTC" firstStartedPulling="2026-01-27 15:26:49.310600793 +0000 UTC m=+1195.291209891" lastFinishedPulling="2026-01-27 15:27:29.36322253 +0000 UTC m=+1235.343831628" observedRunningTime="2026-01-27 15:27:30.24258258 +0000 UTC m=+1236.223191678" watchObservedRunningTime="2026-01-27 15:27:30.251073127 +0000 UTC m=+1236.231682225" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.277072 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-597699949b-q6msx" podStartSLOduration=2.277054472 podStartE2EDuration="2.277054472s" podCreationTimestamp="2026-01-27 15:27:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:30.269883874 +0000 UTC m=+1236.250492982" watchObservedRunningTime="2026-01-27 15:27:30.277054472 +0000 UTC m=+1236.257663570" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.302619 
4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pmk27" podStartSLOduration=2.699131315 podStartE2EDuration="43.302595605s" podCreationTimestamp="2026-01-27 15:26:47 +0000 UTC" firstStartedPulling="2026-01-27 15:26:49.217989205 +0000 UTC m=+1195.198598303" lastFinishedPulling="2026-01-27 15:27:29.821453495 +0000 UTC m=+1235.802062593" observedRunningTime="2026-01-27 15:27:30.292606624 +0000 UTC m=+1236.273215732" watchObservedRunningTime="2026-01-27 15:27:30.302595605 +0000 UTC m=+1236.283204703" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.313528 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.313517882 podStartE2EDuration="9.313517882s" podCreationTimestamp="2026-01-27 15:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:30.311067241 +0000 UTC m=+1236.291676349" watchObservedRunningTime="2026-01-27 15:27:30.313517882 +0000 UTC m=+1236.294126980" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.767040 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.767101 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.804250 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 15:27:30 crc kubenswrapper[4772]: I0127 15:27:30.819293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.256911 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.256952 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.485830 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.486070 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.522874 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:31 crc kubenswrapper[4772]: I0127 15:27:31.537126 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:32 crc kubenswrapper[4772]: I0127 15:27:32.264550 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:32 crc kubenswrapper[4772]: I0127 15:27:32.264879 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:33 crc kubenswrapper[4772]: I0127 15:27:33.211950 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 15:27:34 crc kubenswrapper[4772]: I0127 15:27:34.166558 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.333053 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerStarted","Data":"03f8da2d80772e659c36db9a1b10a6be24dc704eb86ce89c04a5a14351b7726d"} Jan 27 15:27:39 
crc kubenswrapper[4772]: I0127 15:27:39.333545 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-central-agent" containerID="cri-o://f76b5eae8b9d1fd746edffe9a9f5a02ca0ad4ea09665e63c5dbeacff4753fa40" gracePeriod=30 Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.333891 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.334054 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="proxy-httpd" containerID="cri-o://03f8da2d80772e659c36db9a1b10a6be24dc704eb86ce89c04a5a14351b7726d" gracePeriod=30 Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.334274 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="sg-core" containerID="cri-o://b87da5e7b978350e6830e0f65fce50644eee1e1665a4ebcd45d4d0010f0f31d7" gracePeriod=30 Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.334377 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-notification-agent" containerID="cri-o://ed51d0aa4ae1c7166bbf0464f2b405f79a0faa50f99c4244c9717d1a1fd81db2" gracePeriod=30 Jan 27 15:27:39 crc kubenswrapper[4772]: I0127 15:27:39.371826 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.163481096 podStartE2EDuration="52.371798099s" podCreationTimestamp="2026-01-27 15:26:47 +0000 UTC" firstStartedPulling="2026-01-27 15:26:49.076431467 +0000 UTC m=+1195.057040565" lastFinishedPulling="2026-01-27 15:27:38.28474847 +0000 UTC m=+1244.265357568" 
observedRunningTime="2026-01-27 15:27:39.363124797 +0000 UTC m=+1245.343733985" watchObservedRunningTime="2026-01-27 15:27:39.371798099 +0000 UTC m=+1245.352407217" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.343985 4772 generic.go:334] "Generic (PLEG): container finished" podID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerID="03f8da2d80772e659c36db9a1b10a6be24dc704eb86ce89c04a5a14351b7726d" exitCode=0 Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344469 4772 generic.go:334] "Generic (PLEG): container finished" podID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerID="b87da5e7b978350e6830e0f65fce50644eee1e1665a4ebcd45d4d0010f0f31d7" exitCode=2 Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344478 4772 generic.go:334] "Generic (PLEG): container finished" podID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerID="ed51d0aa4ae1c7166bbf0464f2b405f79a0faa50f99c4244c9717d1a1fd81db2" exitCode=0 Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344486 4772 generic.go:334] "Generic (PLEG): container finished" podID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerID="f76b5eae8b9d1fd746edffe9a9f5a02ca0ad4ea09665e63c5dbeacff4753fa40" exitCode=0 Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344079 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerDied","Data":"03f8da2d80772e659c36db9a1b10a6be24dc704eb86ce89c04a5a14351b7726d"} Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerDied","Data":"b87da5e7b978350e6830e0f65fce50644eee1e1665a4ebcd45d4d0010f0f31d7"} Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerDied","Data":"ed51d0aa4ae1c7166bbf0464f2b405f79a0faa50f99c4244c9717d1a1fd81db2"} Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344541 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerDied","Data":"f76b5eae8b9d1fd746edffe9a9f5a02ca0ad4ea09665e63c5dbeacff4753fa40"} Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344550 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de415c6e-4424-49c4-bc9d-076a5b13ab4e","Type":"ContainerDied","Data":"a0254acb416eb806ca40cead3274ef3b55185c0cdbabec25da60a2a08040318a"} Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.344559 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0254acb416eb806ca40cead3274ef3b55185c0cdbabec25da60a2a08040318a" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.362482 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461586 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461668 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461796 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461887 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461916 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvxkw\" (UniqueName: 
\"kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.461978 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd\") pod \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\" (UID: \"de415c6e-4424-49c4-bc9d-076a5b13ab4e\") " Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.462371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.462485 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.463209 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.467458 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw" (OuterVolumeSpecName: "kube-api-access-fvxkw") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "kube-api-access-fvxkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.468034 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts" (OuterVolumeSpecName: "scripts") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.493715 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.525470 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.547092 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data" (OuterVolumeSpecName: "config-data") pod "de415c6e-4424-49c4-bc9d-076a5b13ab4e" (UID: "de415c6e-4424-49c4-bc9d-076a5b13ab4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564289 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de415c6e-4424-49c4-bc9d-076a5b13ab4e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564369 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564380 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564389 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564400 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de415c6e-4424-49c4-bc9d-076a5b13ab4e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:40 crc kubenswrapper[4772]: I0127 15:27:40.564408 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvxkw\" (UniqueName: 
\"kubernetes.io/projected/de415c6e-4424-49c4-bc9d-076a5b13ab4e-kube-api-access-fvxkw\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.352524 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.376939 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.383825 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.394975 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:41 crc kubenswrapper[4772]: E0127 15:27:41.395380 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-central-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395407 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-central-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: E0127 15:27:41.395421 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="sg-core" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395430 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="sg-core" Jan 27 15:27:41 crc kubenswrapper[4772]: E0127 15:27:41.395459 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="proxy-httpd" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395467 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="proxy-httpd" Jan 27 15:27:41 crc kubenswrapper[4772]: E0127 15:27:41.395489 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-notification-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395498 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-notification-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395722 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="sg-core" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395769 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-central-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395779 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="proxy-httpd" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.395813 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" containerName="ceilometer-notification-agent" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.397669 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.399668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.400276 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.410151 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.477517 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.477951 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.477994 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.478019 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnbp\" (UniqueName: \"kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " 
pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.478121 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.478318 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.478400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.488256 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:41 crc kubenswrapper[4772]: E0127 15:27:41.489005 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-tnnbp log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="d8c7b9cc-5427-4c3a-92d7-cec9760975df" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580343 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " 
pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580444 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580467 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580497 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnbp\" (UniqueName: \"kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580515 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.580549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.581101 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.581355 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.585787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.587073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.588018 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.589134 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:41 crc kubenswrapper[4772]: I0127 15:27:41.597202 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnbp\" (UniqueName: \"kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp\") pod \"ceilometer-0\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " pod="openstack/ceilometer-0" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.361202 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.371445 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495279 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495410 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495447 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc 
kubenswrapper[4772]: I0127 15:27:42.495497 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495568 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnnbp\" (UniqueName: \"kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495619 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd\") pod \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\" (UID: \"d8c7b9cc-5427-4c3a-92d7-cec9760975df\") " Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.495980 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.496088 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.496306 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.496321 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8c7b9cc-5427-4c3a-92d7-cec9760975df-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.499357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.499469 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp" (OuterVolumeSpecName: "kube-api-access-tnnbp") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "kube-api-access-tnnbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.499861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data" (OuterVolumeSpecName: "config-data") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.500955 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts" (OuterVolumeSpecName: "scripts") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.502667 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8c7b9cc-5427-4c3a-92d7-cec9760975df" (UID: "d8c7b9cc-5427-4c3a-92d7-cec9760975df"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.597292 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnnbp\" (UniqueName: \"kubernetes.io/projected/d8c7b9cc-5427-4c3a-92d7-cec9760975df-kube-api-access-tnnbp\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.597321 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.597330 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.597338 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.597346 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c7b9cc-5427-4c3a-92d7-cec9760975df-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:42 crc kubenswrapper[4772]: I0127 15:27:42.676347 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de415c6e-4424-49c4-bc9d-076a5b13ab4e" path="/var/lib/kubelet/pods/de415c6e-4424-49c4-bc9d-076a5b13ab4e/volumes" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.372656 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.421544 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.432682 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.442279 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.445246 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.447290 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.451333 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.460357 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.511964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512081 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjsh\" (UniqueName: \"kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512143 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512273 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.512382 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.613680 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.613804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.613886 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.613934 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.614018 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjsh\" (UniqueName: \"kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.614084 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc 
kubenswrapper[4772]: I0127 15:27:43.614161 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.614324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.614844 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.620115 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.620208 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.626003 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " 
pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.628207 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.640354 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjsh\" (UniqueName: \"kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh\") pod \"ceilometer-0\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " pod="openstack/ceilometer-0" Jan 27 15:27:43 crc kubenswrapper[4772]: I0127 15:27:43.777553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:27:44 crc kubenswrapper[4772]: I0127 15:27:44.251349 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:27:44 crc kubenswrapper[4772]: I0127 15:27:44.382526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerStarted","Data":"561a737e7b03fa9b4f9fd3fd05ad062a1b5523a6f22e1965b09d595f10adafd3"} Jan 27 15:27:44 crc kubenswrapper[4772]: I0127 15:27:44.675520 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c7b9cc-5427-4c3a-92d7-cec9760975df" path="/var/lib/kubelet/pods/d8c7b9cc-5427-4c3a-92d7-cec9760975df/volumes" Jan 27 15:27:45 crc kubenswrapper[4772]: I0127 15:27:45.393064 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a423229-06be-4934-9715-58105e1af686" containerID="6ba95c7bf22c812cf8d7d855d86c702f5f7f90db05ec7fc2281ddec549f7d67b" exitCode=0 Jan 27 15:27:45 crc kubenswrapper[4772]: I0127 15:27:45.393109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-pmk27" event={"ID":"5a423229-06be-4934-9715-58105e1af686","Type":"ContainerDied","Data":"6ba95c7bf22c812cf8d7d855d86c702f5f7f90db05ec7fc2281ddec549f7d67b"} Jan 27 15:27:45 crc kubenswrapper[4772]: I0127 15:27:45.395117 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerStarted","Data":"941b08834b6f5b8dafbc182c67d3e458a94c7299ea32b8afd698f876b68ea015"} Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.409858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerStarted","Data":"99f37d09a547a41878834ffbee7e0a0b90552016b42313fae983e9915266d761"} Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.746584 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pmk27" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.773589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle\") pod \"5a423229-06be-4934-9715-58105e1af686\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.773654 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data\") pod \"5a423229-06be-4934-9715-58105e1af686\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.773691 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cvd2\" (UniqueName: \"kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2\") pod 
\"5a423229-06be-4934-9715-58105e1af686\" (UID: \"5a423229-06be-4934-9715-58105e1af686\") " Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.790834 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5a423229-06be-4934-9715-58105e1af686" (UID: "5a423229-06be-4934-9715-58105e1af686"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.793053 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2" (OuterVolumeSpecName: "kube-api-access-8cvd2") pod "5a423229-06be-4934-9715-58105e1af686" (UID: "5a423229-06be-4934-9715-58105e1af686"). InnerVolumeSpecName "kube-api-access-8cvd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.807362 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a423229-06be-4934-9715-58105e1af686" (UID: "5a423229-06be-4934-9715-58105e1af686"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.876349 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.877020 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5a423229-06be-4934-9715-58105e1af686-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:46 crc kubenswrapper[4772]: I0127 15:27:46.877056 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cvd2\" (UniqueName: \"kubernetes.io/projected/5a423229-06be-4934-9715-58105e1af686-kube-api-access-8cvd2\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.420025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerStarted","Data":"ea8686b5fbb3cb04fd3d0cb81bec48b421aa5e6e9be9af4a4ad0ccc951c6bce4"} Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.421787 4772 generic.go:334] "Generic (PLEG): container finished" podID="9ae05919-68bf-43d1-abd9-9908ec287bd0" containerID="f581dd644d182efa5f740dc0b5a2f4adfb865bef3f027972802161889179f1d4" exitCode=0 Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.421859 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8l85z" event={"ID":"9ae05919-68bf-43d1-abd9-9908ec287bd0","Type":"ContainerDied","Data":"f581dd644d182efa5f740dc0b5a2f4adfb865bef3f027972802161889179f1d4"} Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.424700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pmk27" 
event={"ID":"5a423229-06be-4934-9715-58105e1af686","Type":"ContainerDied","Data":"604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de"} Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.424800 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="604f080c5b545eb272e78d6599f0497ec22c32b54d41f8331dbefcd9a29b19de" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.424756 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pmk27" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.682892 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"] Jan 27 15:27:47 crc kubenswrapper[4772]: E0127 15:27:47.683309 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a423229-06be-4934-9715-58105e1af686" containerName="barbican-db-sync" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.683326 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a423229-06be-4934-9715-58105e1af686" containerName="barbican-db-sync" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.683551 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a423229-06be-4934-9715-58105e1af686" containerName="barbican-db-sync" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.684531 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.692971 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.694789 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.698416 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.698641 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-flljj" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.698812 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.699446 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.708056 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.708135 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmsl\" (UniqueName: \"kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.708198 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data\") pod \"barbican-keystone-listener-556764fb84-r628x\" 
(UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.708248 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.708277 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.720538 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.739089 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.788104 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.789689 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.803742 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815465 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815493 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815600 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvs49\" (UniqueName: \"kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.815699 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 
crc kubenswrapper[4772]: I0127 15:27:47.815719 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmsl\" (UniqueName: \"kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.816743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.821366 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.830033 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.830705 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " 
pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.856907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmsl\" (UniqueName: \"kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl\") pod \"barbican-keystone-listener-556764fb84-r628x\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") " pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.920464 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.920897 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.920920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvs49\" (UniqueName: \"kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.920919 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " 
pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.921006 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.921056 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.921084 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxqs\" (UniqueName: \"kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.921218 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.922731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" 
(UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.922788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.922842 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.922928 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.926666 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.926733 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: 
\"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.936750 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.953184 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvs49\" (UniqueName: \"kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49\") pod \"barbican-worker-6748df9c8c-zk7zp\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") " pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.979588 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cb9d976b-flrwl"] Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.981216 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.987142 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 27 15:27:47 crc kubenswrapper[4772]: I0127 15:27:47.997436 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cb9d976b-flrwl"] Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.025304 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.025357 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxqs\" (UniqueName: \"kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.025444 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.025506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc 
kubenswrapper[4772]: I0127 15:27:48.025546 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.025597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.026661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.027306 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.028216 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.028805 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.029534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.029893 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6748df9c8c-zk7zp" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.045593 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-556764fb84-r628x" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.049980 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxqs\" (UniqueName: \"kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs\") pod \"dnsmasq-dns-59d5ff467f-62kx4\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.127564 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.127607 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rbjvx\" (UniqueName: \"kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.128009 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.128118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.128275 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.217266 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.229672 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.229741 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.229797 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.229857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.229888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbjvx\" (UniqueName: \"kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl" 
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.233743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.234865 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.237989 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.243837 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.250081 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbjvx\" (UniqueName: \"kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx\") pod \"barbican-api-cb9d976b-flrwl\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.297944 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cb9d976b-flrwl"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.465903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerStarted","Data":"851c7b3936a50d21408fbed0918adde539924e5915ec73fdcccd952a3392565b"}
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.466073 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.498245 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.003279171 podStartE2EDuration="5.498219276s" podCreationTimestamp="2026-01-27 15:27:43 +0000 UTC" firstStartedPulling="2026-01-27 15:27:44.258624969 +0000 UTC m=+1250.239234067" lastFinishedPulling="2026-01-27 15:27:47.753565074 +0000 UTC m=+1253.734174172" observedRunningTime="2026-01-27 15:27:48.490723209 +0000 UTC m=+1254.471332337" watchObservedRunningTime="2026-01-27 15:27:48.498219276 +0000 UTC m=+1254.478828384"
Jan 27 15:27:48 crc kubenswrapper[4772]: W0127 15:27:48.622793 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod710edaa6_ba83_4b1f_a49a_769ca1911c9b.slice/crio-c83991847bf683630e70d44722d44695d9152a02d09f0a3d6fe39436ebbf262d WatchSource:0}: Error finding container c83991847bf683630e70d44722d44695d9152a02d09f0a3d6fe39436ebbf262d: Status 404 returned error can't find the container with id c83991847bf683630e70d44722d44695d9152a02d09f0a3d6fe39436ebbf262d
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.623106 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"]
Jan 27 15:27:48 crc kubenswrapper[4772]: W0127 15:27:48.690665 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce27714_673f_47de_acc3_b6902b534bdd.slice/crio-51e9e5e71be46820f9c3d1564ff14b9e6df8988ed057a1326779c07f7fee3331 WatchSource:0}: Error finding container 51e9e5e71be46820f9c3d1564ff14b9e6df8988ed057a1326779c07f7fee3331: Status 404 returned error can't find the container with id 51e9e5e71be46820f9c3d1564ff14b9e6df8988ed057a1326779c07f7fee3331
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.694419 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"]
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.770789 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"]
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.893259 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8l85z"
Jan 27 15:27:48 crc kubenswrapper[4772]: I0127 15:27:48.953940 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cb9d976b-flrwl"]
Jan 27 15:27:48 crc kubenswrapper[4772]: W0127 15:27:48.964126 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda02a1b6c_d438_42bf_a577_88bbbcca2a00.slice/crio-37bb9f56f01e20e1b9f2066e0e10a19c1b316c11b490381320b663b46a9cc874 WatchSource:0}: Error finding container 37bb9f56f01e20e1b9f2066e0e10a19c1b316c11b490381320b663b46a9cc874: Status 404 returned error can't find the container with id 37bb9f56f01e20e1b9f2066e0e10a19c1b316c11b490381320b663b46a9cc874
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.051328 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.051539 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tmg\" (UniqueName: \"kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.051814 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.051913 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.051959 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.052013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data\") pod \"9ae05919-68bf-43d1-abd9-9908ec287bd0\" (UID: \"9ae05919-68bf-43d1-abd9-9908ec287bd0\") "
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.052058 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.052763 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9ae05919-68bf-43d1-abd9-9908ec287bd0-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.057095 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts" (OuterVolumeSpecName: "scripts") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.057285 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.059018 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg" (OuterVolumeSpecName: "kube-api-access-d4tmg") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "kube-api-access-d4tmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.083754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.110442 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data" (OuterVolumeSpecName: "config-data") pod "9ae05919-68bf-43d1-abd9-9908ec287bd0" (UID: "9ae05919-68bf-43d1-abd9-9908ec287bd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.153906 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.153943 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.153959 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.153999 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ae05919-68bf-43d1-abd9-9908ec287bd0-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.154011 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4tmg\" (UniqueName: \"kubernetes.io/projected/9ae05919-68bf-43d1-abd9-9908ec287bd0-kube-api-access-d4tmg\") on node \"crc\" DevicePath \"\""
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.476213 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerStarted","Data":"c83991847bf683630e70d44722d44695d9152a02d09f0a3d6fe39436ebbf262d"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.478431 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerStarted","Data":"51e9e5e71be46820f9c3d1564ff14b9e6df8988ed057a1326779c07f7fee3331"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.480459 4772 generic.go:334] "Generic (PLEG): container finished" podID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerID="7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375" exitCode=0
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.480563 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" event={"ID":"eca3ffef-1a57-4aee-9302-64b59ee0fc44","Type":"ContainerDied","Data":"7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.480623 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" event={"ID":"eca3ffef-1a57-4aee-9302-64b59ee0fc44","Type":"ContainerStarted","Data":"89fbde952ff7b7a7ea9f206fc34ae7d5aa1cfb885454833c03af5e3e355d5fb3"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.482105 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8l85z" event={"ID":"9ae05919-68bf-43d1-abd9-9908ec287bd0","Type":"ContainerDied","Data":"6c7bfeb67dfdf4e440bd40114d111aab9077e461f93cb6bdda5f337cad29c97d"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.482131 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7bfeb67dfdf4e440bd40114d111aab9077e461f93cb6bdda5f337cad29c97d"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.482246 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8l85z"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.484520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerStarted","Data":"37bb9f56f01e20e1b9f2066e0e10a19c1b316c11b490381320b663b46a9cc874"}
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.694002 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 27 15:27:49 crc kubenswrapper[4772]: E0127 15:27:49.694489 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" containerName="cinder-db-sync"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.694506 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" containerName="cinder-db-sync"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.694682 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" containerName="cinder-db-sync"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.696582 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.703691 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.703840 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.704052 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.704184 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8nhs4"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.715682 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.768206 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.774244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.774647 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.774707 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tvv\" (UniqueName: \"kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.774761 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.774849 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.778012 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"]
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.823515 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"]
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.825075 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881041 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881417 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tvv\" (UniqueName: \"kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881456 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881509 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881554 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.881591 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.882611 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.892081 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"]
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.899486 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.900732 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.901394 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.901471 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.909803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tvv\" (UniqueName: \"kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv\") pod \"cinder-scheduler-0\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " pod="openstack/cinder-scheduler-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.971284 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.972961 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 27 15:27:49 crc kubenswrapper[4772]: I0127 15:27:49.998890 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpwmg\" (UniqueName: \"kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999485 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999593 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:49.999731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.028800 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.051653 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101859 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101899 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2r7\" (UniqueName: \"kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101923 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101961 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.101997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102045 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102069 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102105 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102240 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102277 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpwmg\" (UniqueName: \"kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102325 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.102351 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.103733 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.103803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.103979 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.105313 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.105462 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.136590 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpwmg\" (UniqueName: \"kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg\") pod \"dnsmasq-dns-69c986f6d7-gszgg\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.176155 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.203819 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2r7\" (UniqueName: \"kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204012 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204047 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204116 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204155 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.204294 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.208697 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.209439 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.210853 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.212496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.222653 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2r7\" (UniqueName: \"kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.224681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom\") pod \"cinder-api-0\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " pod="openstack/cinder-api-0"
Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.332696 4772 util.go:30] "No sandbox for pod can be
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.455093 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.512425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" event={"ID":"eca3ffef-1a57-4aee-9302-64b59ee0fc44","Type":"ContainerStarted","Data":"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d"} Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.512522 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="dnsmasq-dns" containerID="cri-o://3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d" gracePeriod=10 Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.512801 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.517970 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerStarted","Data":"60d07ef91bf8ed929ad3a7649d54a1c26d24e5b004c0a1e93cdd8cb08880d285"} Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.520232 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerStarted","Data":"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3"} Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.520263 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerStarted","Data":"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab"} Jan 27 
15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.520460 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.520566 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.565720 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" podStartSLOduration=3.565698475 podStartE2EDuration="3.565698475s" podCreationTimestamp="2026-01-27 15:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:50.550240136 +0000 UTC m=+1256.530849244" watchObservedRunningTime="2026-01-27 15:27:50.565698475 +0000 UTC m=+1256.546307573" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.577041 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cb9d976b-flrwl" podStartSLOduration=3.5770209250000002 podStartE2EDuration="3.577020925s" podCreationTimestamp="2026-01-27 15:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:50.572659858 +0000 UTC m=+1256.553268986" watchObservedRunningTime="2026-01-27 15:27:50.577020925 +0000 UTC m=+1256.557630023" Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.725266 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"] Jan 27 15:27:50 crc kubenswrapper[4772]: I0127 15:27:50.936923 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:51 crc kubenswrapper[4772]: W0127 15:27:51.256822 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda515cae_40ac_41af_aef5_9cef9f3b366e.slice/crio-005ceeb7f240212b9c37236e7c8187a3f42d3b14297b7bdfc9d9274e9fec1f9d WatchSource:0}: Error finding container 005ceeb7f240212b9c37236e7c8187a3f42d3b14297b7bdfc9d9274e9fec1f9d: Status 404 returned error can't find the container with id 005ceeb7f240212b9c37236e7c8187a3f42d3b14297b7bdfc9d9274e9fec1f9d Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.325868 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.434717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.435995 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vxqs\" (UniqueName: \"kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.436028 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.436114 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: 
\"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.436156 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.436195 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc\") pod \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\" (UID: \"eca3ffef-1a57-4aee-9302-64b59ee0fc44\") " Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.445040 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs" (OuterVolumeSpecName: "kube-api-access-4vxqs") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "kube-api-access-4vxqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.493348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.494528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.505754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.508736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.536370 4772 generic.go:334] "Generic (PLEG): container finished" podID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerID="3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d" exitCode=0 Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.536433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" event={"ID":"eca3ffef-1a57-4aee-9302-64b59ee0fc44","Type":"ContainerDied","Data":"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d"} Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.536459 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" event={"ID":"eca3ffef-1a57-4aee-9302-64b59ee0fc44","Type":"ContainerDied","Data":"89fbde952ff7b7a7ea9f206fc34ae7d5aa1cfb885454833c03af5e3e355d5fb3"} Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.536475 4772 scope.go:117] "RemoveContainer" 
containerID="3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.536588 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-62kx4" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.537896 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.537917 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vxqs\" (UniqueName: \"kubernetes.io/projected/eca3ffef-1a57-4aee-9302-64b59ee0fc44-kube-api-access-4vxqs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.537927 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.537938 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.537947 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.540789 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config" (OuterVolumeSpecName: "config") pod "eca3ffef-1a57-4aee-9302-64b59ee0fc44" (UID: "eca3ffef-1a57-4aee-9302-64b59ee0fc44"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.547349 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerStarted","Data":"005ceeb7f240212b9c37236e7c8187a3f42d3b14297b7bdfc9d9274e9fec1f9d"} Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.554227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" event={"ID":"ce081402-0ada-4fbf-8b22-eb88a50e804b","Type":"ContainerStarted","Data":"59a55a087cb1a139692664b7e8a0e67c4b8ddc2ffa8baef3cc11190a48375152"} Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.639154 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eca3ffef-1a57-4aee-9302-64b59ee0fc44-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.885150 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"] Jan 27 15:27:51 crc kubenswrapper[4772]: I0127 15:27:51.901891 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-62kx4"] Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.062397 4772 scope.go:117] "RemoveContainer" containerID="7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.108001 4772 scope.go:117] "RemoveContainer" containerID="3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d" Jan 27 15:27:52 crc kubenswrapper[4772]: E0127 15:27:52.114993 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d\": container with ID starting with 3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d not found: ID does not exist" 
containerID="3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.115052 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d"} err="failed to get container status \"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d\": rpc error: code = NotFound desc = could not find container \"3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d\": container with ID starting with 3c25ec3ecdb9afdacbc10ba3b1a990e70919f2f5f3c0a3614045131a2f37d22d not found: ID does not exist" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.115087 4772 scope.go:117] "RemoveContainer" containerID="7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375" Jan 27 15:27:52 crc kubenswrapper[4772]: E0127 15:27:52.115592 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375\": container with ID starting with 7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375 not found: ID does not exist" containerID="7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.115616 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375"} err="failed to get container status \"7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375\": rpc error: code = NotFound desc = could not find container \"7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375\": container with ID starting with 7edcc63bf9420fd4c188dbb2b09e3dd62f71b1aeec4c1159720516f01b6da375 not found: ID does not exist" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.570955 4772 generic.go:334] 
"Generic (PLEG): container finished" podID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerID="dc64f95e60b6b7231bfc72557652d02f45abe891bf9e4367174098a9820da0e9" exitCode=0 Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.571270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" event={"ID":"ce081402-0ada-4fbf-8b22-eb88a50e804b","Type":"ContainerDied","Data":"dc64f95e60b6b7231bfc72557652d02f45abe891bf9e4367174098a9820da0e9"} Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.698010 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" path="/var/lib/kubelet/pods/eca3ffef-1a57-4aee-9302-64b59ee0fc44/volumes" Jan 27 15:27:52 crc kubenswrapper[4772]: I0127 15:27:52.772224 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.674930 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerStarted","Data":"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.675206 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerStarted","Data":"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.699652 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6748df9c8c-zk7zp" podStartSLOduration=3.107491642 podStartE2EDuration="6.699638544s" podCreationTimestamp="2026-01-27 15:27:47 +0000 UTC" firstStartedPulling="2026-01-27 15:27:48.641318958 +0000 UTC m=+1254.621928066" lastFinishedPulling="2026-01-27 15:27:52.23346587 +0000 UTC m=+1258.214074968" 
observedRunningTime="2026-01-27 15:27:53.697718438 +0000 UTC m=+1259.678327536" watchObservedRunningTime="2026-01-27 15:27:53.699638544 +0000 UTC m=+1259.680247642" Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.700690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerStarted","Data":"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.700741 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerStarted","Data":"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.718353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerStarted","Data":"55338fb54abd9ad2a096debb2356f749682191abe6b851127e0e95fcec09a654"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.740086 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-556764fb84-r628x" podStartSLOduration=3.302596144 podStartE2EDuration="6.740063209s" podCreationTimestamp="2026-01-27 15:27:47 +0000 UTC" firstStartedPulling="2026-01-27 15:27:48.69472078 +0000 UTC m=+1254.675329878" lastFinishedPulling="2026-01-27 15:27:52.132187845 +0000 UTC m=+1258.112796943" observedRunningTime="2026-01-27 15:27:53.725451555 +0000 UTC m=+1259.706060653" watchObservedRunningTime="2026-01-27 15:27:53.740063209 +0000 UTC m=+1259.720672307" Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.765260 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerStarted","Data":"cd2c8c5778f70b0c56d0f79379ec5079a2ceb539c1d125108e64ede41e68dc9b"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.775537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" event={"ID":"ce081402-0ada-4fbf-8b22-eb88a50e804b","Type":"ContainerStarted","Data":"672b3b869d3cac682f2871f1cccebaabe0e2baa030c87285ea2dc87b60951bbf"} Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.775852 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" Jan 27 15:27:53 crc kubenswrapper[4772]: I0127 15:27:53.802104 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" podStartSLOduration=4.802081813 podStartE2EDuration="4.802081813s" podCreationTimestamp="2026-01-27 15:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:53.799205539 +0000 UTC m=+1259.779814637" watchObservedRunningTime="2026-01-27 15:27:53.802081813 +0000 UTC m=+1259.782690911" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.353678 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.679802 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 15:27:54 crc kubenswrapper[4772]: E0127 15:27:54.680132 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="dnsmasq-dns" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.680147 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="dnsmasq-dns" Jan 27 15:27:54 crc kubenswrapper[4772]: E0127 
15:27:54.680186 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="init" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.680195 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="init" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.680411 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca3ffef-1a57-4aee-9302-64b59ee0fc44" containerName="dnsmasq-dns" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.681388 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.684031 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.687672 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.692528 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.759511 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.791308 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerStarted","Data":"7232aaea46edf2977ebe24eeb1188331c23b40d3efeaaca7dfc75e3658d209b6"} Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.809629 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerStarted","Data":"537d1bcf9f874816a464389d56071423783d326469277b895f554d80f412795e"} Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.809638 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api-log" containerID="cri-o://cd2c8c5778f70b0c56d0f79379ec5079a2ceb539c1d125108e64ede41e68dc9b" gracePeriod=30 Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.809683 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.809771 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api" containerID="cri-o://537d1bcf9f874816a464389d56071423783d326469277b895f554d80f412795e" gracePeriod=30 Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.815756 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.873166652 podStartE2EDuration="5.815742358s" podCreationTimestamp="2026-01-27 15:27:49 +0000 UTC" firstStartedPulling="2026-01-27 15:27:50.463871474 +0000 UTC m=+1256.444480572" lastFinishedPulling="2026-01-27 15:27:52.40644718 +0000 UTC m=+1258.387056278" observedRunningTime="2026-01-27 15:27:54.81444356 +0000 UTC m=+1260.795052658" watchObservedRunningTime="2026-01-27 15:27:54.815742358 +0000 UTC m=+1260.796351456" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " 
pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826732 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826761 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826816 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: 
\"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.826913 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqskw\" (UniqueName: \"kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.864075 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.8640541729999995 podStartE2EDuration="5.864054173s" podCreationTimestamp="2026-01-27 15:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:54.847566254 +0000 UTC m=+1260.828175372" watchObservedRunningTime="2026-01-27 15:27:54.864054173 +0000 UTC m=+1260.844663271" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929330 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929396 4772 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929475 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqskw\" (UniqueName: \"kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929579 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.929597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.931300 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.943125 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.944225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.944722 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.944859 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.951280 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom\") pod 
\"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:54 crc kubenswrapper[4772]: I0127 15:27:54.981413 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqskw\" (UniqueName: \"kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw\") pod \"barbican-api-659485ddbb-5bnzg\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.036255 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.052764 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.729862 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.825001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerStarted","Data":"181aa9237802812b703a88787d1d6892177f6147a0214d407241520c82b45857"} Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.829930 4772 generic.go:334] "Generic (PLEG): container finished" podID="b0625578-3b48-44c7-9082-174fce3a7e74" containerID="d2de8b3a1c27ebd01b5c3393c6dcb85d202fe549eef0c41d0f9f318c3b15d219" exitCode=0 Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.829989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v689b" event={"ID":"b0625578-3b48-44c7-9082-174fce3a7e74","Type":"ContainerDied","Data":"d2de8b3a1c27ebd01b5c3393c6dcb85d202fe549eef0c41d0f9f318c3b15d219"} Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.833099 4772 
generic.go:334] "Generic (PLEG): container finished" podID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerID="537d1bcf9f874816a464389d56071423783d326469277b895f554d80f412795e" exitCode=0 Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.833319 4772 generic.go:334] "Generic (PLEG): container finished" podID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerID="cd2c8c5778f70b0c56d0f79379ec5079a2ceb539c1d125108e64ede41e68dc9b" exitCode=143 Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.833347 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerDied","Data":"537d1bcf9f874816a464389d56071423783d326469277b895f554d80f412795e"} Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.834429 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerDied","Data":"cd2c8c5778f70b0c56d0f79379ec5079a2ceb539c1d125108e64ede41e68dc9b"} Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.841094 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.973937 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.974253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.974397 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.974549 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l2r7\" (UniqueName: \"kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.974655 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.974790 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.975050 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts\") pod \"da515cae-40ac-41af-aef5-9cef9f3b366e\" (UID: \"da515cae-40ac-41af-aef5-9cef9f3b366e\") " Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.975270 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.975533 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs" (OuterVolumeSpecName: "logs") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.975996 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da515cae-40ac-41af-aef5-9cef9f3b366e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.976092 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da515cae-40ac-41af-aef5-9cef9f3b366e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.980618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts" (OuterVolumeSpecName: "scripts") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.985690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7" (OuterVolumeSpecName: "kube-api-access-4l2r7") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "kube-api-access-4l2r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:55 crc kubenswrapper[4772]: I0127 15:27:55.993663 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.063796 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.077862 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.077897 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.077910 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l2r7\" (UniqueName: \"kubernetes.io/projected/da515cae-40ac-41af-aef5-9cef9f3b366e-kube-api-access-4l2r7\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.077920 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.102293 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data" (OuterVolumeSpecName: "config-data") pod "da515cae-40ac-41af-aef5-9cef9f3b366e" (UID: "da515cae-40ac-41af-aef5-9cef9f3b366e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.179703 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da515cae-40ac-41af-aef5-9cef9f3b366e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.187945 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.845597 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.845579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"da515cae-40ac-41af-aef5-9cef9f3b366e","Type":"ContainerDied","Data":"005ceeb7f240212b9c37236e7c8187a3f42d3b14297b7bdfc9d9274e9fec1f9d"} Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.846072 4772 scope.go:117] "RemoveContainer" containerID="537d1bcf9f874816a464389d56071423783d326469277b895f554d80f412795e" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.849351 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerStarted","Data":"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874"} Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.849405 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerStarted","Data":"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573"} Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.878701 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-659485ddbb-5bnzg" podStartSLOduration=2.878681384 
podStartE2EDuration="2.878681384s" podCreationTimestamp="2026-01-27 15:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:27:56.870958299 +0000 UTC m=+1262.851567397" watchObservedRunningTime="2026-01-27 15:27:56.878681384 +0000 UTC m=+1262.859290482" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.895244 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.898009 4772 scope.go:117] "RemoveContainer" containerID="cd2c8c5778f70b0c56d0f79379ec5079a2ceb539c1d125108e64ede41e68dc9b" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.904073 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.920230 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:56 crc kubenswrapper[4772]: E0127 15:27:56.920613 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.920632 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api" Jan 27 15:27:56 crc kubenswrapper[4772]: E0127 15:27:56.920653 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api-log" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.920660 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api-log" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.920821 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api" Jan 27 15:27:56 crc kubenswrapper[4772]: 
I0127 15:27:56.920841 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" containerName="cinder-api-log" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.923252 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.929181 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.929476 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.929817 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.933536 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994184 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994223 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts\") 
pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994314 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vclj\" (UniqueName: \"kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994490 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc kubenswrapper[4772]: I0127 15:27:56.994528 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:56 crc 
kubenswrapper[4772]: I0127 15:27:56.994604 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097302 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097353 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097402 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vclj\" (UniqueName: 
\"kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097554 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097611 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.097683 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 
15:27:57.101737 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.102938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.103045 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.103799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.105989 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.111209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.115246 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.133504 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vclj\" (UniqueName: \"kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj\") pod \"cinder-api-0\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.271664 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.316212 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v689b" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.409028 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle\") pod \"b0625578-3b48-44c7-9082-174fce3a7e74\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.409099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config\") pod \"b0625578-3b48-44c7-9082-174fce3a7e74\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.409235 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jn9b\" (UniqueName: \"kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b\") pod \"b0625578-3b48-44c7-9082-174fce3a7e74\" (UID: \"b0625578-3b48-44c7-9082-174fce3a7e74\") " Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.415348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b" (OuterVolumeSpecName: "kube-api-access-9jn9b") pod "b0625578-3b48-44c7-9082-174fce3a7e74" (UID: "b0625578-3b48-44c7-9082-174fce3a7e74"). InnerVolumeSpecName "kube-api-access-9jn9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.434488 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0625578-3b48-44c7-9082-174fce3a7e74" (UID: "b0625578-3b48-44c7-9082-174fce3a7e74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.436800 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config" (OuterVolumeSpecName: "config") pod "b0625578-3b48-44c7-9082-174fce3a7e74" (UID: "b0625578-3b48-44c7-9082-174fce3a7e74"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.512650 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jn9b\" (UniqueName: \"kubernetes.io/projected/b0625578-3b48-44c7-9082-174fce3a7e74-kube-api-access-9jn9b\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.512694 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.512732 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b0625578-3b48-44c7-9082-174fce3a7e74-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.857345 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 15:27:57 crc kubenswrapper[4772]: E0127 15:27:57.859052 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0625578-3b48-44c7-9082-174fce3a7e74" containerName="neutron-db-sync" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.859073 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0625578-3b48-44c7-9082-174fce3a7e74" containerName="neutron-db-sync" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.859339 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0625578-3b48-44c7-9082-174fce3a7e74" 
containerName="neutron-db-sync" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.860123 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.873567 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.876579 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.876651 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7d9v8" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.876771 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.884905 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-v689b" event={"ID":"b0625578-3b48-44c7-9082-174fce3a7e74","Type":"ContainerDied","Data":"b3123ce803c91e7738d4af911f91769cd0703aad347549f4989b2ccc532f36ea"} Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.884963 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3123ce803c91e7738d4af911f91769cd0703aad347549f4989b2ccc532f36ea" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.884963 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-v689b" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.893840 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:57 crc kubenswrapper[4772]: I0127 15:27:57.893894 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.037800 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.037879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.037919 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.038113 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhnj\" (UniqueName: \"kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 
15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.082095 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"] Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.087825 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="dnsmasq-dns" containerID="cri-o://672b3b869d3cac682f2871f1cccebaabe0e2baa030c87285ea2dc87b60951bbf" gracePeriod=10 Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.091387 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.121583 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.123236 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.144973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhnj\" (UniqueName: \"kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.145065 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.145114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.145138 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.147898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.155186 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.157035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.159242 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.179814 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhnj\" (UniqueName: 
\"kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj\") pod \"openstackclient\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.194983 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.250146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.250241 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.250280 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.250366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s774r\" (UniqueName: \"kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 
crc kubenswrapper[4772]: I0127 15:27:58.250397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.250423 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.298094 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.309314 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.313275 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.313505 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.313652 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.316500 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rjm9r" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.320776 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355307 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s774r\" (UniqueName: \"kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355368 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: 
\"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355442 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355481 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.355521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.356589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.356872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 
crc kubenswrapper[4772]: I0127 15:27:58.357156 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.357254 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.357796 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.391105 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s774r\" (UniqueName: \"kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r\") pod \"dnsmasq-dns-5784cf869f-2849v\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.457381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8pb\" (UniqueName: \"kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.457464 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.457528 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.457798 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.457884 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.560276 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8pb\" (UniqueName: \"kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.560379 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.560431 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.560597 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.560643 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.565218 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.566487 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.567216 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.579011 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.581878 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.583900 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8pb\" (UniqueName: \"kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb\") pod \"neutron-66bf894476-wz7b5\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.639245 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.681966 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da515cae-40ac-41af-aef5-9cef9f3b366e" path="/var/lib/kubelet/pods/da515cae-40ac-41af-aef5-9cef9f3b366e/volumes" Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.969839 4772 generic.go:334] "Generic (PLEG): container finished" podID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerID="672b3b869d3cac682f2871f1cccebaabe0e2baa030c87285ea2dc87b60951bbf" exitCode=0 Jan 27 15:27:58 crc kubenswrapper[4772]: I0127 15:27:58.971692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" event={"ID":"ce081402-0ada-4fbf-8b22-eb88a50e804b","Type":"ContainerDied","Data":"672b3b869d3cac682f2871f1cccebaabe0e2baa030c87285ea2dc87b60951bbf"} Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.477889 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.569939 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590401 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590503 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpwmg\" (UniqueName: \"kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590752 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " 
Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.590811 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc\") pod \"ce081402-0ada-4fbf-8b22-eb88a50e804b\" (UID: \"ce081402-0ada-4fbf-8b22-eb88a50e804b\") " Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.597873 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.626649 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg" (OuterVolumeSpecName: "kube-api-access-hpwmg") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "kube-api-access-hpwmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: W0127 15:27:59.654309 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe772158_a71c_448d_8972_014f0d3a9ab8.slice/crio-2e68d940e0eebbc1216da3357187ae70827b7d508fb0a26f0e91d9593aac8852 WatchSource:0}: Error finding container 2e68d940e0eebbc1216da3357187ae70827b7d508fb0a26f0e91d9593aac8852: Status 404 returned error can't find the container with id 2e68d940e0eebbc1216da3357187ae70827b7d508fb0a26f0e91d9593aac8852 Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.692556 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpwmg\" (UniqueName: \"kubernetes.io/projected/ce081402-0ada-4fbf-8b22-eb88a50e804b-kube-api-access-hpwmg\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.806349 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 
15:27:59.807457 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.813773 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.841931 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.862604 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config" (OuterVolumeSpecName: "config") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.892932 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.898719 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.898762 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.898781 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.898796 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.902492 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce081402-0ada-4fbf-8b22-eb88a50e804b" (UID: "ce081402-0ada-4fbf-8b22-eb88a50e804b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:27:59 crc kubenswrapper[4772]: I0127 15:27:59.994642 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerStarted","Data":"2e68d940e0eebbc1216da3357187ae70827b7d508fb0a26f0e91d9593aac8852"} Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.001071 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce081402-0ada-4fbf-8b22-eb88a50e804b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.008587 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" event={"ID":"ce081402-0ada-4fbf-8b22-eb88a50e804b","Type":"ContainerDied","Data":"59a55a087cb1a139692664b7e8a0e67c4b8ddc2ffa8baef3cc11190a48375152"} Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.008601 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-gszgg" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.008642 4772 scope.go:117] "RemoveContainer" containerID="672b3b869d3cac682f2871f1cccebaabe0e2baa030c87285ea2dc87b60951bbf" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.031647 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerStarted","Data":"d30df4d73e5cfb24af9149a2561d9917bae2965d7778272568f0bfb1966f855f"} Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.064564 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"] Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.064623 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2849v" event={"ID":"4ab060da-8587-413a-a410-ee0e9cec40c6","Type":"ContainerStarted","Data":"894acb53cf18e87f7e2c3c3b36e872c3494f3f5477487f3a865082faee222b95"} Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.075768 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-gszgg"] Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.083454 4772 scope.go:117] "RemoveContainer" containerID="dc64f95e60b6b7231bfc72557652d02f45abe891bf9e4367174098a9820da0e9" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.084291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0edf6707-14dd-4986-8d64-0e48a31d6a39","Type":"ContainerStarted","Data":"5c52f5cf3b82427db2a187bbd0708a64e4f14f826b96324500c229ad2e72a4cf"} Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.385961 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.607442 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.616843 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.688706 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" path="/var/lib/kubelet/pods/ce081402-0ada-4fbf-8b22-eb88a50e804b/volumes" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.689360 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-597699949b-q6msx" Jan 27 15:28:00 crc kubenswrapper[4772]: I0127 15:28:00.940733 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.154198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerStarted","Data":"b5a8f7019a8ae14ffdea4c25f43d7ff45e4469316acbf03b2364b347f5933e7c"} Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.154234 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerStarted","Data":"8859f4bb50887ba9951c0e2249a3e56deff79409c3a080683519e71c92360a6d"} Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.154531 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.161844 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerID="d9accb6fdf89d9e80604718e7ef1a89b03857412edcadace6f8fed5be8e5dfab" exitCode=0 Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.161947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5784cf869f-2849v" event={"ID":"4ab060da-8587-413a-a410-ee0e9cec40c6","Type":"ContainerDied","Data":"d9accb6fdf89d9e80604718e7ef1a89b03857412edcadace6f8fed5be8e5dfab"} Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.175767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerStarted","Data":"26cc6d1f580535edc969fb0f7d0d2e7d716fa8450f944ca1657554f90801529b"} Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.175936 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="cinder-scheduler" containerID="cri-o://55338fb54abd9ad2a096debb2356f749682191abe6b851127e0e95fcec09a654" gracePeriod=30 Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.176132 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="probe" containerID="cri-o://7232aaea46edf2977ebe24eeb1188331c23b40d3efeaaca7dfc75e3658d209b6" gracePeriod=30 Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.187318 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66bf894476-wz7b5" podStartSLOduration=3.18729179 podStartE2EDuration="3.18729179s" podCreationTimestamp="2026-01-27 15:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:01.176434204 +0000 UTC m=+1267.157043312" watchObservedRunningTime="2026-01-27 15:28:01.18729179 +0000 UTC m=+1267.167900888" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.252063 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-597699949b-q6msx" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.708809 
4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:28:01 crc kubenswrapper[4772]: E0127 15:28:01.721828 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="init" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.721874 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="init" Jan 27 15:28:01 crc kubenswrapper[4772]: E0127 15:28:01.721887 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="dnsmasq-dns" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.721895 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="dnsmasq-dns" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.722161 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce081402-0ada-4fbf-8b22-eb88a50e804b" containerName="dnsmasq-dns" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.723369 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.727205 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.727385 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.733885 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876417 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876483 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876538 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876561 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876604 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brxbx\" (UniqueName: \"kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.876675 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978124 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978538 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978630 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.978938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brxbx\" (UniqueName: 
\"kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.986017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.986879 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.987952 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.988645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.988884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs\") pod \"neutron-647c88bb6f-wzf82\" (UID: 
\"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.992004 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:01 crc kubenswrapper[4772]: I0127 15:28:01.994821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brxbx\" (UniqueName: \"kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx\") pod \"neutron-647c88bb6f-wzf82\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.078946 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.216382 4772 generic.go:334] "Generic (PLEG): container finished" podID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerID="7232aaea46edf2977ebe24eeb1188331c23b40d3efeaaca7dfc75e3658d209b6" exitCode=0 Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.216467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerDied","Data":"7232aaea46edf2977ebe24eeb1188331c23b40d3efeaaca7dfc75e3658d209b6"} Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.251227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2849v" event={"ID":"4ab060da-8587-413a-a410-ee0e9cec40c6","Type":"ContainerStarted","Data":"b3b0580b2d9a989010c2055ae938a024c281531976b862414fc303ffddcf01e5"} Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.251307 4772 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.311745 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-2849v" podStartSLOduration=4.311722635 podStartE2EDuration="4.311722635s" podCreationTimestamp="2026-01-27 15:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:02.301499628 +0000 UTC m=+1268.282108746" watchObservedRunningTime="2026-01-27 15:28:02.311722635 +0000 UTC m=+1268.292331733" Jan 27 15:28:02 crc kubenswrapper[4772]: I0127 15:28:02.830060 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.021265 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.279348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerStarted","Data":"c47159ab0aee5087f5a44073988d2ad8d6aaaa0e47ba7702dc2a03eab229b375"} Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.279614 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.295046 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerStarted","Data":"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37"} Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.295107 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" 
event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerStarted","Data":"eb3fc136e47d75ea92171b2a25f1728b294a61ff0f248fa056a324eadfc98f00"} Jan 27 15:28:03 crc kubenswrapper[4772]: I0127 15:28:03.321044 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.321024384 podStartE2EDuration="7.321024384s" podCreationTimestamp="2026-01-27 15:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:03.303089343 +0000 UTC m=+1269.283698441" watchObservedRunningTime="2026-01-27 15:28:03.321024384 +0000 UTC m=+1269.301633482" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.314946 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerStarted","Data":"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2"} Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.315101 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.319571 4772 generic.go:334] "Generic (PLEG): container finished" podID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerID="55338fb54abd9ad2a096debb2356f749682191abe6b851127e0e95fcec09a654" exitCode=0 Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.320745 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerDied","Data":"55338fb54abd9ad2a096debb2356f749682191abe6b851127e0e95fcec09a654"} Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.337658 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-647c88bb6f-wzf82" podStartSLOduration=3.337639085 podStartE2EDuration="3.337639085s" 
podCreationTimestamp="2026-01-27 15:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:04.333508305 +0000 UTC m=+1270.314117403" watchObservedRunningTime="2026-01-27 15:28:04.337639085 +0000 UTC m=+1270.318248183" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.724513 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tvv\" (UniqueName: \"kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756439 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756463 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756493 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756597 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.756669 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom\") pod \"7b7d9f3f-e366-421e-b00d-9c453da1adca\" (UID: \"7b7d9f3f-e366-421e-b00d-9c453da1adca\") " Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.758613 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.774141 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv" (OuterVolumeSpecName: "kube-api-access-g2tvv") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "kube-api-access-g2tvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.784849 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts" (OuterVolumeSpecName: "scripts") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.792121 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.861506 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tvv\" (UniqueName: \"kubernetes.io/projected/7b7d9f3f-e366-421e-b00d-9c453da1adca-kube-api-access-g2tvv\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.861803 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.861928 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b7d9f3f-e366-421e-b00d-9c453da1adca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.862026 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.897893 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.921553 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data" (OuterVolumeSpecName: "config-data") pod "7b7d9f3f-e366-421e-b00d-9c453da1adca" (UID: "7b7d9f3f-e366-421e-b00d-9c453da1adca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.963916 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:04 crc kubenswrapper[4772]: I0127 15:28:04.964229 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7d9f3f-e366-421e-b00d-9c453da1adca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.337556 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.338674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b7d9f3f-e366-421e-b00d-9c453da1adca","Type":"ContainerDied","Data":"60d07ef91bf8ed929ad3a7649d54a1c26d24e5b004c0a1e93cdd8cb08880d285"} Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.338739 4772 scope.go:117] "RemoveContainer" containerID="7232aaea46edf2977ebe24eeb1188331c23b40d3efeaaca7dfc75e3658d209b6" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.374529 4772 scope.go:117] "RemoveContainer" containerID="55338fb54abd9ad2a096debb2356f749682191abe6b851127e0e95fcec09a654" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.375910 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.383431 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.465254 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:05 crc kubenswrapper[4772]: E0127 15:28:05.465988 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="cinder-scheduler" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.466011 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="cinder-scheduler" Jan 27 15:28:05 crc kubenswrapper[4772]: E0127 15:28:05.466059 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="probe" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.466070 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="probe" Jan 27 15:28:05 crc kubenswrapper[4772]: 
I0127 15:28:05.467576 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="probe" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.467615 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" containerName="cinder-scheduler" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.476501 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.478884 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.479005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.479042 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrz6x\" (UniqueName: \"kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.479112 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id\") pod \"cinder-scheduler-0\" 
(UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.479151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.479198 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.484974 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.489714 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.560906 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.579903 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.579951 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrz6x\" (UniqueName: 
\"kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.580007 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.580045 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.580068 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.580104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.580432 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " 
pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.597038 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.599715 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.613717 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.628893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrz6x\" (UniqueName: \"kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.636906 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data\") pod \"cinder-scheduler-0\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " pod="openstack/cinder-scheduler-0" Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.647698 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-api-cb9d976b-flrwl"] Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.647969 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cb9d976b-flrwl" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api-log" containerID="cri-o://6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab" gracePeriod=30 Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.648464 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cb9d976b-flrwl" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api" containerID="cri-o://51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3" gracePeriod=30 Jan 27 15:28:05 crc kubenswrapper[4772]: I0127 15:28:05.824136 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.321150 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:28:06 crc kubenswrapper[4772]: W0127 15:28:06.343196 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod683f458e_44e9_49ea_a66b_4ac91a3f2bc1.slice/crio-bb200c044803c6c5491d60dc192f271f4cdf0adcf18a5f0f12ab40acb77fdf72 WatchSource:0}: Error finding container bb200c044803c6c5491d60dc192f271f4cdf0adcf18a5f0f12ab40acb77fdf72: Status 404 returned error can't find the container with id bb200c044803c6c5491d60dc192f271f4cdf0adcf18a5f0f12ab40acb77fdf72 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.377048 4772 generic.go:334] "Generic (PLEG): container finished" podID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerID="6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab" exitCode=143 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.377285 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerDied","Data":"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab"} Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.545649 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.546551 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-central-agent" containerID="cri-o://941b08834b6f5b8dafbc182c67d3e458a94c7299ea32b8afd698f876b68ea015" gracePeriod=30 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.547675 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="proxy-httpd" containerID="cri-o://851c7b3936a50d21408fbed0918adde539924e5915ec73fdcccd952a3392565b" gracePeriod=30 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.547712 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-notification-agent" containerID="cri-o://99f37d09a547a41878834ffbee7e0a0b90552016b42313fae983e9915266d761" gracePeriod=30 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.547792 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="sg-core" containerID="cri-o://ea8686b5fbb3cb04fd3d0cb81bec48b421aa5e6e9be9af4a4ad0ccc951c6bce4" gracePeriod=30 Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.567549 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="proxy-httpd" probeResult="failure" 
output="Get \"http://10.217.0.154:3000/\": EOF" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.684323 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7d9f3f-e366-421e-b00d-9c453da1adca" path="/var/lib/kubelet/pods/7b7d9f3f-e366-421e-b00d-9c453da1adca/volumes" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.762830 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"] Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.769809 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.777964 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.778460 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.778315 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.791334 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"] Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917243 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917300 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd\") pod 
\"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917446 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzf2\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917497 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:06 crc kubenswrapper[4772]: I0127 15:28:06.917515 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019529 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzf2\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019611 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019644 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019723 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019749 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019890 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.019949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.023681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd\") pod 
\"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.023752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.026152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.026423 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.026745 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.028044 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " 
pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.028961 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.042191 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzf2\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2\") pod \"swift-proxy-d86f6cfbc-cwfmc\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.111156 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.415554 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerStarted","Data":"3e806373a2604b5465de7a3913d6865c82f0689bac61f26c430950d7d4efb948"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.416286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerStarted","Data":"bb200c044803c6c5491d60dc192f271f4cdf0adcf18a5f0f12ab40acb77fdf72"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432693 4772 generic.go:334] "Generic (PLEG): container finished" podID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerID="851c7b3936a50d21408fbed0918adde539924e5915ec73fdcccd952a3392565b" exitCode=0 Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432737 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerID="ea8686b5fbb3cb04fd3d0cb81bec48b421aa5e6e9be9af4a4ad0ccc951c6bce4" exitCode=2 Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432745 4772 generic.go:334] "Generic (PLEG): container finished" podID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerID="99f37d09a547a41878834ffbee7e0a0b90552016b42313fae983e9915266d761" exitCode=0 Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432752 4772 generic.go:334] "Generic (PLEG): container finished" podID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerID="941b08834b6f5b8dafbc182c67d3e458a94c7299ea32b8afd698f876b68ea015" exitCode=0 Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerDied","Data":"851c7b3936a50d21408fbed0918adde539924e5915ec73fdcccd952a3392565b"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432814 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerDied","Data":"ea8686b5fbb3cb04fd3d0cb81bec48b421aa5e6e9be9af4a4ad0ccc951c6bce4"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432825 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerDied","Data":"99f37d09a547a41878834ffbee7e0a0b90552016b42313fae983e9915266d761"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.432833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerDied","Data":"941b08834b6f5b8dafbc182c67d3e458a94c7299ea32b8afd698f876b68ea015"} Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.726656 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.769099 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"] Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847473 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847524 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847565 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847584 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847649 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: 
I0127 15:28:07.847700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.847755 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjsh\" (UniqueName: \"kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh\") pod \"f8583377-67ef-4cca-83bb-08d7523ab0a8\" (UID: \"f8583377-67ef-4cca-83bb-08d7523ab0a8\") " Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.850502 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.852923 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.857381 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh" (OuterVolumeSpecName: "kube-api-access-9jjsh") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "kube-api-access-9jjsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.858185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts" (OuterVolumeSpecName: "scripts") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.922665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.949823 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.949864 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.949876 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:07 crc kubenswrapper[4772]: I0127 15:28:07.949887 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f8583377-67ef-4cca-83bb-08d7523ab0a8-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:07 crc kubenswrapper[4772]: 
I0127 15:28:07.949897 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjsh\" (UniqueName: \"kubernetes.io/projected/f8583377-67ef-4cca-83bb-08d7523ab0a8-kube-api-access-9jjsh\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.001467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.029277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data" (OuterVolumeSpecName: "config-data") pod "f8583377-67ef-4cca-83bb-08d7523ab0a8" (UID: "f8583377-67ef-4cca-83bb-08d7523ab0a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.051761 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.052003 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8583377-67ef-4cca-83bb-08d7523ab0a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.480499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerStarted","Data":"47a1d8c4913044388b407e6a5c05783d2d3731216d7862873425d28265a5fe05"} Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.480771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerStarted","Data":"a476d84a3741734575b073569a645d9d973c5cdbb39812aa454a7257859db22b"} Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.480782 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerStarted","Data":"92e9170b2797b87fe5816f61d1944a7f0cca88f2e0e21f7420f27a5ed25d4005"} Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.501503 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f8583377-67ef-4cca-83bb-08d7523ab0a8","Type":"ContainerDied","Data":"561a737e7b03fa9b4f9fd3fd05ad062a1b5523a6f22e1965b09d595f10adafd3"} Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.501565 4772 scope.go:117] "RemoveContainer" 
containerID="851c7b3936a50d21408fbed0918adde539924e5915ec73fdcccd952a3392565b" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.501741 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.542147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerStarted","Data":"112ddc6068b3694383f83c1ffece42788a7623920d1c02ff9f46202f7c8c0d7e"} Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.550079 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.597129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.608995 4772 scope.go:117] "RemoveContainer" containerID="ea8686b5fbb3cb04fd3d0cb81bec48b421aa5e6e9be9af4a4ad0ccc951c6bce4" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.642175 4772 scope.go:117] "RemoveContainer" containerID="99f37d09a547a41878834ffbee7e0a0b90552016b42313fae983e9915266d761" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.650409 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.692233 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" path="/var/lib/kubelet/pods/f8583377-67ef-4cca-83bb-08d7523ab0a8/volumes" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.692992 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:08 crc kubenswrapper[4772]: E0127 15:28:08.693348 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="sg-core" Jan 27 
15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693361 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="sg-core" Jan 27 15:28:08 crc kubenswrapper[4772]: E0127 15:28:08.693375 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-notification-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693381 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-notification-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: E0127 15:28:08.693399 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="proxy-httpd" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693404 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="proxy-httpd" Jan 27 15:28:08 crc kubenswrapper[4772]: E0127 15:28:08.693421 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-central-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693427 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-central-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693596 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="sg-core" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693607 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-central-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693621 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" 
containerName="proxy-httpd" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.693629 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8583377-67ef-4cca-83bb-08d7523ab0a8" containerName="ceilometer-notification-agent" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.695857 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.697832 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.698916 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.698269 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.698257854 podStartE2EDuration="3.698257854s" podCreationTimestamp="2026-01-27 15:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:08.574045412 +0000 UTC m=+1274.554654510" watchObservedRunningTime="2026-01-27 15:28:08.698257854 +0000 UTC m=+1274.678866952" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.722198 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.752706 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.752925 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="dnsmasq-dns" containerID="cri-o://9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac" gracePeriod=10 Jan 27 15:28:08 crc 
kubenswrapper[4772]: I0127 15:28:08.757756 4772 scope.go:117] "RemoveContainer" containerID="941b08834b6f5b8dafbc182c67d3e458a94c7299ea32b8afd698f876b68ea015" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776209 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhrl9\" (UniqueName: \"kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776585 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776636 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776681 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.776738 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.862782 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cb9d976b-flrwl" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:49270->10.217.0.158:9311: read: connection reset by peer" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.863142 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cb9d976b-flrwl" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:49258->10.217.0.158:9311: read: connection reset by peer" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.878114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhrl9\" (UniqueName: \"kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc 
kubenswrapper[4772]: I0127 15:28:08.878729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.879561 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.880378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.880580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.880770 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.880941 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd\") pod 
\"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.881518 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.883051 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.884809 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.887804 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.889203 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.893023 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:08 crc kubenswrapper[4772]: I0127 15:28:08.904252 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhrl9\" (UniqueName: \"kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9\") pod \"ceilometer-0\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " pod="openstack/ceilometer-0" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.031988 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.466041 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.583367 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.589210 4772 generic.go:334] "Generic (PLEG): container finished" podID="17a547a9-a098-43b7-a153-ad9a137369de" containerID="9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac" exitCode=0 Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.589265 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" event={"ID":"17a547a9-a098-43b7-a153-ad9a137369de","Type":"ContainerDied","Data":"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac"} Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.589292 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" event={"ID":"17a547a9-a098-43b7-a153-ad9a137369de","Type":"ContainerDied","Data":"31e83f6ba26ca249b5435d61c8786bdc24b0777adb10cbb234cdaacbda3e0db7"} Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.589308 4772 scope.go:117] "RemoveContainer" containerID="9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.589415 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-dqgvx" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.604013 4772 generic.go:334] "Generic (PLEG): container finished" podID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerID="51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3" exitCode=0 Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.604940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerDied","Data":"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3"} Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.604997 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb9d976b-flrwl" event={"ID":"a02a1b6c-d438-42bf-a577-88bbbcca2a00","Type":"ContainerDied","Data":"37bb9f56f01e20e1b9f2066e0e10a19c1b316c11b490381320b663b46a9cc874"} Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.605180 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cb9d976b-flrwl" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.605426 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.605448 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607127 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607299 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607384 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.607436 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzhkt\" (UniqueName: \"kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt\") pod \"17a547a9-a098-43b7-a153-ad9a137369de\" (UID: \"17a547a9-a098-43b7-a153-ad9a137369de\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.631580 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt" (OuterVolumeSpecName: "kube-api-access-xzhkt") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). InnerVolumeSpecName "kube-api-access-xzhkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.682266 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" podStartSLOduration=3.682240335 podStartE2EDuration="3.682240335s" podCreationTimestamp="2026-01-27 15:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:09.644089806 +0000 UTC m=+1275.624698924" watchObservedRunningTime="2026-01-27 15:28:09.682240335 +0000 UTC m=+1275.662849443" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.709987 4772 scope.go:117] "RemoveContainer" containerID="dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.712261 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data\") pod \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.712307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom\") pod \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.712389 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle\") pod \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.712461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbjvx\" (UniqueName: \"kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx\") pod \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.712546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs\") pod \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\" (UID: \"a02a1b6c-d438-42bf-a577-88bbbcca2a00\") " Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.713208 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzhkt\" (UniqueName: \"kubernetes.io/projected/17a547a9-a098-43b7-a153-ad9a137369de-kube-api-access-xzhkt\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.732344 4772 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.737891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a02a1b6c-d438-42bf-a577-88bbbcca2a00" (UID: "a02a1b6c-d438-42bf-a577-88bbbcca2a00"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.738539 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs" (OuterVolumeSpecName: "logs") pod "a02a1b6c-d438-42bf-a577-88bbbcca2a00" (UID: "a02a1b6c-d438-42bf-a577-88bbbcca2a00"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.750970 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx" (OuterVolumeSpecName: "kube-api-access-rbjvx") pod "a02a1b6c-d438-42bf-a577-88bbbcca2a00" (UID: "a02a1b6c-d438-42bf-a577-88bbbcca2a00"). InnerVolumeSpecName "kube-api-access-rbjvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.758132 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.759099 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.782525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.783311 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a02a1b6c-d438-42bf-a577-88bbbcca2a00" (UID: "a02a1b6c-d438-42bf-a577-88bbbcca2a00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.805383 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config" (OuterVolumeSpecName: "config") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.824665 4772 scope.go:117] "RemoveContainer" containerID="9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.830823 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17a547a9-a098-43b7-a153-ad9a137369de" (UID: "17a547a9-a098-43b7-a153-ad9a137369de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.831954 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.832068 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbjvx\" (UniqueName: \"kubernetes.io/projected/a02a1b6c-d438-42bf-a577-88bbbcca2a00-kube-api-access-rbjvx\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.832146 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833025 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a02a1b6c-d438-42bf-a577-88bbbcca2a00-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833116 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833208 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833383 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833461 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17a547a9-a098-43b7-a153-ad9a137369de-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.833536 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: E0127 15:28:09.837937 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac\": container with ID starting with 9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac not found: ID does not exist" containerID="9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.837994 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac"} err="failed to get container status \"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac\": rpc error: code = NotFound desc = could not find container 
\"9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac\": container with ID starting with 9b585b108c0dc492c1e4b07fece64b278518a1b12e04b9e57c9b27d9183ca9ac not found: ID does not exist" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.838028 4772 scope.go:117] "RemoveContainer" containerID="dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023" Jan 27 15:28:09 crc kubenswrapper[4772]: E0127 15:28:09.838627 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023\": container with ID starting with dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023 not found: ID does not exist" containerID="dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.838651 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023"} err="failed to get container status \"dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023\": rpc error: code = NotFound desc = could not find container \"dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023\": container with ID starting with dbf85c4247c8ba3d5d079a5efed1ba22279b244d404fd6795f678db691f0b023 not found: ID does not exist" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.838672 4772 scope.go:117] "RemoveContainer" containerID="51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.875510 4772 scope.go:117] "RemoveContainer" containerID="6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.900901 4772 scope.go:117] "RemoveContainer" containerID="51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3" Jan 27 15:28:09 crc 
kubenswrapper[4772]: E0127 15:28:09.901616 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3\": container with ID starting with 51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3 not found: ID does not exist" containerID="51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.901788 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3"} err="failed to get container status \"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3\": rpc error: code = NotFound desc = could not find container \"51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3\": container with ID starting with 51682243156b2ae0619aa3431049d4b3ccf7b64abb3cf0a6f8d8cabce6c5c3f3 not found: ID does not exist" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.902032 4772 scope.go:117] "RemoveContainer" containerID="6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab" Jan 27 15:28:09 crc kubenswrapper[4772]: E0127 15:28:09.902473 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab\": container with ID starting with 6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab not found: ID does not exist" containerID="6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.902542 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab"} err="failed to get container status 
\"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab\": rpc error: code = NotFound desc = could not find container \"6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab\": container with ID starting with 6263f4825b44d3954903ca3307e160ecf8edb8b8f916573aa23ea8b2efbc78ab not found: ID does not exist" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.916727 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data" (OuterVolumeSpecName: "config-data") pod "a02a1b6c-d438-42bf-a577-88bbbcca2a00" (UID: "a02a1b6c-d438-42bf-a577-88bbbcca2a00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.932182 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.935484 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a02a1b6c-d438-42bf-a577-88bbbcca2a00-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:09 crc kubenswrapper[4772]: I0127 15:28:09.942234 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-dqgvx"] Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.266297 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cb9d976b-flrwl"] Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.283507 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cb9d976b-flrwl"] Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.641758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerStarted","Data":"efec7f44770a0bed43e7dc53d45c9b414c83392d3729ddafeb636db96612decb"} Jan 27 15:28:10 crc 
kubenswrapper[4772]: I0127 15:28:10.675972 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17a547a9-a098-43b7-a153-ad9a137369de" path="/var/lib/kubelet/pods/17a547a9-a098-43b7-a153-ad9a137369de/volumes" Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.676866 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" path="/var/lib/kubelet/pods/a02a1b6c-d438-42bf-a577-88bbbcca2a00/volumes" Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.762996 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 15:28:10 crc kubenswrapper[4772]: I0127 15:28:10.824514 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 15:28:11 crc kubenswrapper[4772]: I0127 15:28:11.657761 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerStarted","Data":"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96"} Jan 27 15:28:12 crc kubenswrapper[4772]: I0127 15:28:12.683617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerStarted","Data":"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b"} Jan 27 15:28:13 crc kubenswrapper[4772]: I0127 15:28:13.738077 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:13 crc kubenswrapper[4772]: I0127 15:28:13.738686 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-log" containerID="cri-o://c52299828ac41e83b1686de53ba3808d1e810b20370ec9d5bc6e9bbc6b64bbed" gracePeriod=30 Jan 27 15:28:13 crc kubenswrapper[4772]: I0127 
15:28:13.739141 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-httpd" containerID="cri-o://9775d2c5b4eda3cae695814a686a4a82d4426bf3d7d28a73dffa9b807c4c16b8" gracePeriod=30 Jan 27 15:28:14 crc kubenswrapper[4772]: I0127 15:28:14.697801 4772 generic.go:334] "Generic (PLEG): container finished" podID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerID="c52299828ac41e83b1686de53ba3808d1e810b20370ec9d5bc6e9bbc6b64bbed" exitCode=143 Jan 27 15:28:14 crc kubenswrapper[4772]: I0127 15:28:14.697891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerDied","Data":"c52299828ac41e83b1686de53ba3808d1e810b20370ec9d5bc6e9bbc6b64bbed"} Jan 27 15:28:16 crc kubenswrapper[4772]: I0127 15:28:16.076181 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 15:28:16 crc kubenswrapper[4772]: I0127 15:28:16.601806 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:16 crc kubenswrapper[4772]: I0127 15:28:16.602422 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-log" containerID="cri-o://58d128e4a7f44cc529be47e9f224989cce3b8a08dc4e4f4d37d49e38c0c7b8d2" gracePeriod=30 Jan 27 15:28:16 crc kubenswrapper[4772]: I0127 15:28:16.602493 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-httpd" containerID="cri-o://f1accbd1db4a8c2dce7512a2eb2abaa265e29ed373b0fc121d29515c5bba0e55" gracePeriod=30 Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 
15:28:17.125825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 15:28:17.126731 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 15:28:17.731541 4772 generic.go:334] "Generic (PLEG): container finished" podID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerID="9775d2c5b4eda3cae695814a686a4a82d4426bf3d7d28a73dffa9b807c4c16b8" exitCode=0 Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 15:28:17.731683 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerDied","Data":"9775d2c5b4eda3cae695814a686a4a82d4426bf3d7d28a73dffa9b807c4c16b8"} Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 15:28:17.743026 4772 generic.go:334] "Generic (PLEG): container finished" podID="41f85a83-f245-40ff-b994-50cab01b2530" containerID="58d128e4a7f44cc529be47e9f224989cce3b8a08dc4e4f4d37d49e38c0c7b8d2" exitCode=143 Jan 27 15:28:17 crc kubenswrapper[4772]: I0127 15:28:17.743408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerDied","Data":"58d128e4a7f44cc529be47e9f224989cce3b8a08dc4e4f4d37d49e38c0c7b8d2"} Jan 27 15:28:18 crc kubenswrapper[4772]: I0127 15:28:18.765806 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.653852 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pszgr"] Jan 27 15:28:19 crc kubenswrapper[4772]: E0127 15:28:19.664383 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api" Jan 27 15:28:19 crc 
kubenswrapper[4772]: I0127 15:28:19.664430 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api" Jan 27 15:28:19 crc kubenswrapper[4772]: E0127 15:28:19.664471 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="dnsmasq-dns" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664480 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="dnsmasq-dns" Jan 27 15:28:19 crc kubenswrapper[4772]: E0127 15:28:19.664518 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api-log" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664527 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api-log" Jan 27 15:28:19 crc kubenswrapper[4772]: E0127 15:28:19.664574 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="init" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664582 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="init" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664943 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664969 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="17a547a9-a098-43b7-a153-ad9a137369de" containerName="dnsmasq-dns" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.664979 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02a1b6c-d438-42bf-a577-88bbbcca2a00" containerName="barbican-api-log" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.665902 
4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.668618 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pszgr"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.748059 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-gbrww"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.749715 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.750674 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.750821 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwt4l\" (UniqueName: \"kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.759812 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gbrww"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.775351 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5104-account-create-update-vp7x7"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.776480 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.784520 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.834272 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5104-account-create-update-vp7x7"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts\") pod \"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853475 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853583 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwt4l\" (UniqueName: \"kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853613 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts\") pod \"nova-cell0-db-create-gbrww\" (UID: 
\"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvqvj\" (UniqueName: \"kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj\") pod \"nova-cell0-db-create-gbrww\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.853749 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtnzx\" (UniqueName: \"kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx\") pod \"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.854497 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.874560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwt4l\" (UniqueName: \"kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l\") pod \"nova-api-db-create-pszgr\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.954860 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts\") pod 
\"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.954980 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts\") pod \"nova-cell0-db-create-gbrww\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.955062 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvqvj\" (UniqueName: \"kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj\") pod \"nova-cell0-db-create-gbrww\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.955119 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtnzx\" (UniqueName: \"kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx\") pod \"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.955808 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6w7p7"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.955848 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts\") pod \"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 
15:28:19.956031 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts\") pod \"nova-cell0-db-create-gbrww\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.957913 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.975668 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6w7p7"] Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.985466 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtnzx\" (UniqueName: \"kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx\") pod \"nova-api-5104-account-create-update-vp7x7\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.998546 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:19 crc kubenswrapper[4772]: I0127 15:28:19.999006 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvqvj\" (UniqueName: \"kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj\") pod \"nova-cell0-db-create-gbrww\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.015870 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6af8-account-create-update-ltwnh"] Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.017423 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.021428 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.055861 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.055916 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.056008 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znsll\" (UniqueName: \"kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.056117 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv946\" (UniqueName: \"kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc 
kubenswrapper[4772]: I0127 15:28:20.062817 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6af8-account-create-update-ltwnh"] Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.070224 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.102929 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.158819 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.158877 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.158929 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znsll\" (UniqueName: \"kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.158996 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv946\" (UniqueName: 
\"kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.160640 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.161257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.162422 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-12a3-account-create-update-mdv84"] Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.163492 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.168223 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.183230 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-12a3-account-create-update-mdv84"] Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.194445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znsll\" (UniqueName: \"kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll\") pod \"nova-cell0-6af8-account-create-update-ltwnh\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.199655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv946\" (UniqueName: \"kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946\") pod \"nova-cell1-db-create-6w7p7\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.262129 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878xd\" (UniqueName: \"kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd\") pod \"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.262310 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts\") pod 
\"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.279890 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.363898 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878xd\" (UniqueName: \"kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd\") pod \"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.364424 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts\") pod \"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.365288 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts\") pod \"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.385864 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878xd\" (UniqueName: \"kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd\") pod \"nova-cell1-12a3-account-create-update-mdv84\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " 
pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.396707 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.571942 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.879772 4772 generic.go:334] "Generic (PLEG): container finished" podID="41f85a83-f245-40ff-b994-50cab01b2530" containerID="f1accbd1db4a8c2dce7512a2eb2abaa265e29ed373b0fc121d29515c5bba0e55" exitCode=0 Jan 27 15:28:20 crc kubenswrapper[4772]: I0127 15:28:20.879880 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerDied","Data":"f1accbd1db4a8c2dce7512a2eb2abaa265e29ed373b0fc121d29515c5bba0e55"} Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.196720 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.198147 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q49rw\" (UniqueName: \"kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.198231 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.198319 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.200058 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.199051 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.201210 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.209223 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw" (OuterVolumeSpecName: "kube-api-access-q49rw") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "kube-api-access-q49rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.212067 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.273992 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.303630 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.303723 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.303907 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.303959 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts\") pod \"c94a7cfa-28e2-4d52-85a1-d5586f162227\" (UID: \"c94a7cfa-28e2-4d52-85a1-d5586f162227\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.304690 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q49rw\" (UniqueName: \"kubernetes.io/projected/c94a7cfa-28e2-4d52-85a1-d5586f162227-kube-api-access-q49rw\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.304712 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc 
kubenswrapper[4772]: I0127 15:28:21.304738 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.306301 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs" (OuterVolumeSpecName: "logs") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.312411 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts" (OuterVolumeSpecName: "scripts") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.405652 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.405682 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c94a7cfa-28e2-4d52-85a1-d5586f162227-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.414682 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.442374 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data" (OuterVolumeSpecName: "config-data") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.446089 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c94a7cfa-28e2-4d52-85a1-d5586f162227" (UID: "c94a7cfa-28e2-4d52-85a1-d5586f162227"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.460255 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506392 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506473 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506501 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506557 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9t4\" (UniqueName: \"kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506584 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506682 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506714 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts\") pod \"41f85a83-f245-40ff-b994-50cab01b2530\" (UID: \"41f85a83-f245-40ff-b994-50cab01b2530\") " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506969 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.506994 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.507008 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c94a7cfa-28e2-4d52-85a1-d5586f162227-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.508228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs" (OuterVolumeSpecName: "logs") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.508438 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.509958 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.510413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts" (OuterVolumeSpecName: "scripts") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.526709 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4" (OuterVolumeSpecName: "kube-api-access-pc9t4") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "kube-api-access-pc9t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.554282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.574092 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data" (OuterVolumeSpecName: "config-data") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.578660 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41f85a83-f245-40ff-b994-50cab01b2530" (UID: "41f85a83-f245-40ff-b994-50cab01b2530"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608137 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608185 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608197 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/41f85a83-f245-40ff-b994-50cab01b2530-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608207 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9t4\" (UniqueName: \"kubernetes.io/projected/41f85a83-f245-40ff-b994-50cab01b2530-kube-api-access-pc9t4\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608219 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608226 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608234 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.608242 4772 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f85a83-f245-40ff-b994-50cab01b2530-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.644782 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.711917 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.818969 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5104-account-create-update-vp7x7"] Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.864274 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6af8-account-create-update-ltwnh"] Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.877073 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-gbrww"] Jan 27 15:28:21 crc kubenswrapper[4772]: W0127 15:28:21.890290 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7907cc16_7665_49d3_ad17_f9e6e0fc2f09.slice/crio-5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0 WatchSource:0}: Error finding container 5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0: Status 404 returned error can't find the container with id 5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0 Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.928375 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6w7p7"] Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.943494 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"41f85a83-f245-40ff-b994-50cab01b2530","Type":"ContainerDied","Data":"70507e561102278ed4f801ac168676eb09960026059c30c45c5fe4950449c589"} Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.943542 4772 scope.go:117] "RemoveContainer" containerID="f1accbd1db4a8c2dce7512a2eb2abaa265e29ed373b0fc121d29515c5bba0e55" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.943660 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.951507 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pszgr"] Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.964105 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pszgr" event={"ID":"54bbbf38-088b-4e4d-8154-569667fcf9a9","Type":"ContainerStarted","Data":"303a25d43e818afc0b1deec34d71e454f848a7bcfa271b0aa6ddd404903c4f4b"} Jan 27 15:28:21 crc kubenswrapper[4772]: I0127 15:28:21.989429 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-12a3-account-create-update-mdv84"] Jan 27 15:28:22 crc kubenswrapper[4772]: W0127 15:28:22.053532 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08d7e14a_70d3_446e_8250_ca1047b5bc4b.slice/crio-4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254 WatchSource:0}: Error finding container 4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254: Status 404 returned error can't find the container with id 4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254 Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.053695 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5104-account-create-update-vp7x7" 
event={"ID":"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f","Type":"ContainerStarted","Data":"9536a6635c6de01fa9cbbeb2b3e4a3db2498f81bff722635d21115dedc7f8ce3"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.091381 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0edf6707-14dd-4986-8d64-0e48a31d6a39","Type":"ContainerStarted","Data":"0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.125489 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.139375 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.139690 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.139743 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c94a7cfa-28e2-4d52-85a1-d5586f162227","Type":"ContainerDied","Data":"65117f0b87347a480b318a709dc150116a10a8d323bd2553e117803b3054a685"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.149340 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: E0127 15:28:22.150433 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.150550 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: E0127 15:28:22.150620 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.150686 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: E0127 15:28:22.150743 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.150791 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: E0127 15:28:22.150862 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.150920 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.151156 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.151309 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.151472 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f85a83-f245-40ff-b994-50cab01b2530" containerName="glance-log" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.151553 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-httpd" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.153507 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.158752 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.158805 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.158903 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vd4fn" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.158929 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.171119 4772 scope.go:117] "RemoveContainer" containerID="58d128e4a7f44cc529be47e9f224989cce3b8a08dc4e4f4d37d49e38c0c7b8d2" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.179879 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.201687 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.112610248 podStartE2EDuration="25.201667156s" podCreationTimestamp="2026-01-27 15:27:57 +0000 UTC" firstStartedPulling="2026-01-27 15:27:59.635104725 +0000 UTC m=+1265.615713823" lastFinishedPulling="2026-01-27 15:28:20.724161633 +0000 UTC m=+1286.704770731" observedRunningTime="2026-01-27 15:28:22.15638888 +0000 UTC m=+1288.136997978" watchObservedRunningTime="2026-01-27 15:28:22.201667156 +0000 UTC m=+1288.182276254" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.213006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerStarted","Data":"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.222192 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" event={"ID":"564de425-5170-45df-9080-5b02579483ee","Type":"ContainerStarted","Data":"deaf013f2c9b59ae86d1e302aff4c0c47992ea99da1873382d3b8f4f077d78da"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.232640 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.232920 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gbrww" event={"ID":"be888039-f158-4d05-9f7d-6d01b2478b08","Type":"ContainerStarted","Data":"56740a6f5142d64002a343f18ba92dbb512d9b8a106c7a40a6e41b349e43508e"} Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.237212 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.237342 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.241213 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.241377 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.241414 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.241467 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.241509 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.283096 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.296317 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.319389 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.323031 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.335460 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.336874 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.337606 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.340157 4772 scope.go:117] "RemoveContainer" containerID="9775d2c5b4eda3cae695814a686a4a82d4426bf3d7d28a73dffa9b807c4c16b8" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.343813 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.343937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344861 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.347053 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.344515 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.345092 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.346953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.360201 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.361011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.376937 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.383329 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.388582 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.431955 4772 scope.go:117] "RemoveContainer" containerID="c52299828ac41e83b1686de53ba3808d1e810b20370ec9d5bc6e9bbc6b64bbed" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.438472 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.558866 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.558936 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.558957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.559053 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.559096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.559119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.559136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.559208 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzrk\" (UniqueName: \"kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.601703 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.660838 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661111 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661456 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 
crc kubenswrapper[4772]: I0127 15:28:22.661481 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661606 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghzrk\" (UniqueName: \"kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661675 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.661832 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.662413 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.662646 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.681760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.691036 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f85a83-f245-40ff-b994-50cab01b2530" path="/var/lib/kubelet/pods/41f85a83-f245-40ff-b994-50cab01b2530/volumes" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.692705 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" path="/var/lib/kubelet/pods/c94a7cfa-28e2-4d52-85a1-d5586f162227/volumes" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.707285 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.707397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.708220 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.708619 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghzrk\" (UniqueName: \"kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:22 crc kubenswrapper[4772]: I0127 15:28:22.835951 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " pod="openstack/glance-default-external-api-0" Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.109195 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.244592 4772 generic.go:334] "Generic (PLEG): container finished" podID="54bbbf38-088b-4e4d-8154-569667fcf9a9" containerID="4da9288c82c7401f434d2a53ff336e0d653eb3932d204eafc2869a5860cee4bc" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.244669 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pszgr" event={"ID":"54bbbf38-088b-4e4d-8154-569667fcf9a9","Type":"ContainerDied","Data":"4da9288c82c7401f434d2a53ff336e0d653eb3932d204eafc2869a5860cee4bc"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.246089 4772 generic.go:334] "Generic (PLEG): container finished" podID="69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" containerID="e995550ae720943eacfd405b30c920c20d450c9bc6c2389b27261b188859406e" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.246176 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5104-account-create-update-vp7x7" event={"ID":"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f","Type":"ContainerDied","Data":"e995550ae720943eacfd405b30c920c20d450c9bc6c2389b27261b188859406e"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.249115 4772 generic.go:334] "Generic (PLEG): container finished" podID="7907cc16-7665-49d3-ad17-f9e6e0fc2f09" containerID="9853bf54eae9ce0f1c3b8ddee31101fe10bc44f0b0f41d495f936c0ac3cc7ec8" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.249217 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6w7p7" event={"ID":"7907cc16-7665-49d3-ad17-f9e6e0fc2f09","Type":"ContainerDied","Data":"9853bf54eae9ce0f1c3b8ddee31101fe10bc44f0b0f41d495f936c0ac3cc7ec8"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.249240 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6w7p7" 
event={"ID":"7907cc16-7665-49d3-ad17-f9e6e0fc2f09","Type":"ContainerStarted","Data":"5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.250362 4772 generic.go:334] "Generic (PLEG): container finished" podID="08d7e14a-70d3-446e-8250-ca1047b5bc4b" containerID="f28ff63f10f8899bc8cd8fd5a42bd4249a187e430ea97934ef6b489554310751" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.250458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" event={"ID":"08d7e14a-70d3-446e-8250-ca1047b5bc4b","Type":"ContainerDied","Data":"f28ff63f10f8899bc8cd8fd5a42bd4249a187e430ea97934ef6b489554310751"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.250489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" event={"ID":"08d7e14a-70d3-446e-8250-ca1047b5bc4b","Type":"ContainerStarted","Data":"4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.252627 4772 generic.go:334] "Generic (PLEG): container finished" podID="564de425-5170-45df-9080-5b02579483ee" containerID="46996df047d6fd10b3034c52a93ce3634cebbfdcb4bf44854f66da5e6d342110" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.252679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" event={"ID":"564de425-5170-45df-9080-5b02579483ee","Type":"ContainerDied","Data":"46996df047d6fd10b3034c52a93ce3634cebbfdcb4bf44854f66da5e6d342110"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.255778 4772 generic.go:334] "Generic (PLEG): container finished" podID="be888039-f158-4d05-9f7d-6d01b2478b08" containerID="e1b312b7631d415f567909a3003da4cdfd7208b6894d1397aa7da34098746b5a" exitCode=0 Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.256614 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-db-create-gbrww" event={"ID":"be888039-f158-4d05-9f7d-6d01b2478b08","Type":"ContainerDied","Data":"e1b312b7631d415f567909a3003da4cdfd7208b6894d1397aa7da34098746b5a"} Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.373387 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:28:23 crc kubenswrapper[4772]: I0127 15:28:23.719583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282117 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerStarted","Data":"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e"} Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282276 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-central-agent" containerID="cri-o://96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96" gracePeriod=30 Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282343 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="sg-core" containerID="cri-o://2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a" gracePeriod=30 Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282359 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-notification-agent" containerID="cri-o://902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b" gracePeriod=30 Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282327 4772 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="proxy-httpd" containerID="cri-o://db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e" gracePeriod=30 Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.282769 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.290961 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerStarted","Data":"1536a68238e83bb2c89cfe9a0fce1841bc4d60d2a518fdc49dc1b005d27a6470"} Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.297971 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerStarted","Data":"3454f9899adaff309b52934e71697924735c1f269fb473444cba03b5baf4e1e5"} Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.298059 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerStarted","Data":"64eb2d8855af54c245dc9d145df3ac0064c424271a5cf4af6c9815a1aa8bc16e"} Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.315213 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.425757034 podStartE2EDuration="16.315150772s" podCreationTimestamp="2026-01-27 15:28:08 +0000 UTC" firstStartedPulling="2026-01-27 15:28:09.811238676 +0000 UTC m=+1275.791847774" lastFinishedPulling="2026-01-27 15:28:23.700632414 +0000 UTC m=+1289.681241512" observedRunningTime="2026-01-27 15:28:24.313945607 +0000 UTC m=+1290.294554715" watchObservedRunningTime="2026-01-27 15:28:24.315150772 +0000 UTC m=+1290.295759880" Jan 27 15:28:24 crc 
kubenswrapper[4772]: I0127 15:28:24.609906 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.701305 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts\") pod \"54bbbf38-088b-4e4d-8154-569667fcf9a9\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.701491 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwt4l\" (UniqueName: \"kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l\") pod \"54bbbf38-088b-4e4d-8154-569667fcf9a9\" (UID: \"54bbbf38-088b-4e4d-8154-569667fcf9a9\") " Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.707285 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l" (OuterVolumeSpecName: "kube-api-access-vwt4l") pod "54bbbf38-088b-4e4d-8154-569667fcf9a9" (UID: "54bbbf38-088b-4e4d-8154-569667fcf9a9"). InnerVolumeSpecName "kube-api-access-vwt4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.714643 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54bbbf38-088b-4e4d-8154-569667fcf9a9" (UID: "54bbbf38-088b-4e4d-8154-569667fcf9a9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.804516 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54bbbf38-088b-4e4d-8154-569667fcf9a9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:24 crc kubenswrapper[4772]: I0127 15:28:24.804866 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwt4l\" (UniqueName: \"kubernetes.io/projected/54bbbf38-088b-4e4d-8154-569667fcf9a9-kube-api-access-vwt4l\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.310950 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.350266 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.354887 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" event={"ID":"08d7e14a-70d3-446e-8250-ca1047b5bc4b","Type":"ContainerDied","Data":"4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.354918 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.354920 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae0384de6c053e3c0b5213f6dd65e86d1254cb54c25b58aa19f1adb13fef254" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.355889 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.364711 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397265 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerID="db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e" exitCode=0 Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397311 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerID="2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a" exitCode=2 Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397326 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerID="96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96" exitCode=0 Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerDied","Data":"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerDied","Data":"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.397441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerDied","Data":"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 
15:28:25.410319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerStarted","Data":"d767e789b4befb7b8caac693075691222c00bb6ae1189417345706dad41621f9"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.440733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-gbrww" event={"ID":"be888039-f158-4d05-9f7d-6d01b2478b08","Type":"ContainerDied","Data":"56740a6f5142d64002a343f18ba92dbb512d9b8a106c7a40a6e41b349e43508e"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.441270 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56740a6f5142d64002a343f18ba92dbb512d9b8a106c7a40a6e41b349e43508e" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.441410 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-gbrww" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.442955 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtnzx\" (UniqueName: \"kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx\") pod \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.447496 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts\") pod \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.447659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts\") pod 
\"be888039-f158-4d05-9f7d-6d01b2478b08\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.447732 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv946\" (UniqueName: \"kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946\") pod \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\" (UID: \"7907cc16-7665-49d3-ad17-f9e6e0fc2f09\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.447809 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvqvj\" (UniqueName: \"kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj\") pod \"be888039-f158-4d05-9f7d-6d01b2478b08\" (UID: \"be888039-f158-4d05-9f7d-6d01b2478b08\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.447842 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts\") pod \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\" (UID: \"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.451891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be888039-f158-4d05-9f7d-6d01b2478b08" (UID: "be888039-f158-4d05-9f7d-6d01b2478b08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.452021 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" (UID: "69f24c00-a64a-4e82-a125-c0ee3fe8fa8f"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.452159 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7907cc16-7665-49d3-ad17-f9e6e0fc2f09" (UID: "7907cc16-7665-49d3-ad17-f9e6e0fc2f09"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.461228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx" (OuterVolumeSpecName: "kube-api-access-dtnzx") pod "69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" (UID: "69f24c00-a64a-4e82-a125-c0ee3fe8fa8f"). InnerVolumeSpecName "kube-api-access-dtnzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.463447 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946" (OuterVolumeSpecName: "kube-api-access-hv946") pod "7907cc16-7665-49d3-ad17-f9e6e0fc2f09" (UID: "7907cc16-7665-49d3-ad17-f9e6e0fc2f09"). InnerVolumeSpecName "kube-api-access-hv946". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.470102 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj" (OuterVolumeSpecName: "kube-api-access-pvqvj") pod "be888039-f158-4d05-9f7d-6d01b2478b08" (UID: "be888039-f158-4d05-9f7d-6d01b2478b08"). InnerVolumeSpecName "kube-api-access-pvqvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.477612 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pszgr" event={"ID":"54bbbf38-088b-4e4d-8154-569667fcf9a9","Type":"ContainerDied","Data":"303a25d43e818afc0b1deec34d71e454f848a7bcfa271b0aa6ddd404903c4f4b"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.477663 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="303a25d43e818afc0b1deec34d71e454f848a7bcfa271b0aa6ddd404903c4f4b" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.477674 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-pszgr" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.486742 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5104-account-create-update-vp7x7" event={"ID":"69f24c00-a64a-4e82-a125-c0ee3fe8fa8f","Type":"ContainerDied","Data":"9536a6635c6de01fa9cbbeb2b3e4a3db2498f81bff722635d21115dedc7f8ce3"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.486790 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9536a6635c6de01fa9cbbeb2b3e4a3db2498f81bff722635d21115dedc7f8ce3" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.486906 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5104-account-create-update-vp7x7" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.515454 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerStarted","Data":"6481b50eed7f8997cc197c4b50a1b5d1b9aa395b3745aa30ff2d6ee451d23215"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.519514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6w7p7" event={"ID":"7907cc16-7665-49d3-ad17-f9e6e0fc2f09","Type":"ContainerDied","Data":"5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0"} Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.519561 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5abfcf1bdd92bd3b6c78b0f416ccffb023fa1f4cdaf92b953a574b587cc4a6d0" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.519614 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6w7p7" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.552521 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts\") pod \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.552636 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znsll\" (UniqueName: \"kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll\") pod \"564de425-5170-45df-9080-5b02579483ee\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.552662 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-878xd\" (UniqueName: \"kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd\") pod \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\" (UID: \"08d7e14a-70d3-446e-8250-ca1047b5bc4b\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.552732 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts\") pod \"564de425-5170-45df-9080-5b02579483ee\" (UID: \"564de425-5170-45df-9080-5b02579483ee\") " Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553374 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtnzx\" (UniqueName: \"kubernetes.io/projected/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-kube-api-access-dtnzx\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553400 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553413 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be888039-f158-4d05-9f7d-6d01b2478b08-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553424 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv946\" (UniqueName: \"kubernetes.io/projected/7907cc16-7665-49d3-ad17-f9e6e0fc2f09-kube-api-access-hv946\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553438 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvqvj\" (UniqueName: \"kubernetes.io/projected/be888039-f158-4d05-9f7d-6d01b2478b08-kube-api-access-pvqvj\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553451 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.553960 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08d7e14a-70d3-446e-8250-ca1047b5bc4b" (UID: "08d7e14a-70d3-446e-8250-ca1047b5bc4b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.555975 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "564de425-5170-45df-9080-5b02579483ee" (UID: "564de425-5170-45df-9080-5b02579483ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.561854 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll" (OuterVolumeSpecName: "kube-api-access-znsll") pod "564de425-5170-45df-9080-5b02579483ee" (UID: "564de425-5170-45df-9080-5b02579483ee"). InnerVolumeSpecName "kube-api-access-znsll". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.566427 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd" (OuterVolumeSpecName: "kube-api-access-878xd") pod "08d7e14a-70d3-446e-8250-ca1047b5bc4b" (UID: "08d7e14a-70d3-446e-8250-ca1047b5bc4b"). InnerVolumeSpecName "kube-api-access-878xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.651338 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.651313775 podStartE2EDuration="3.651313775s" podCreationTimestamp="2026-01-27 15:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:25.550504004 +0000 UTC m=+1291.531113122" watchObservedRunningTime="2026-01-27 15:28:25.651313775 +0000 UTC m=+1291.631922873" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.655404 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08d7e14a-70d3-446e-8250-ca1047b5bc4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.655442 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znsll\" (UniqueName: \"kubernetes.io/projected/564de425-5170-45df-9080-5b02579483ee-kube-api-access-znsll\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.655456 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-878xd\" (UniqueName: \"kubernetes.io/projected/08d7e14a-70d3-446e-8250-ca1047b5bc4b-kube-api-access-878xd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:25 crc kubenswrapper[4772]: I0127 15:28:25.655467 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/564de425-5170-45df-9080-5b02579483ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.510139 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.529134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerStarted","Data":"3114715e24bc63a93ce31ec7ec2cc2fdeaad0a6c7647de22f23d06ac45e3d864"} Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.532354 4772 generic.go:334] "Generic (PLEG): container finished" podID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerID="902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b" exitCode=0 Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.532444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerDied","Data":"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b"} Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.532459 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.532487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65e74d64-83f8-4964-8950-bf76816dd5fc","Type":"ContainerDied","Data":"efec7f44770a0bed43e7dc53d45c9b414c83392d3729ddafeb636db96612decb"} Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.532531 4772 scope.go:117] "RemoveContainer" containerID="db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.537057 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-12a3-account-create-update-mdv84" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.540733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" event={"ID":"564de425-5170-45df-9080-5b02579483ee","Type":"ContainerDied","Data":"deaf013f2c9b59ae86d1e302aff4c0c47992ea99da1873382d3b8f4f077d78da"} Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.540772 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="deaf013f2c9b59ae86d1e302aff4c0c47992ea99da1873382d3b8f4f077d78da" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.540861 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6af8-account-create-update-ltwnh" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.568295 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.568277939 podStartE2EDuration="4.568277939s" podCreationTimestamp="2026-01-27 15:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:28:26.558192145 +0000 UTC m=+1292.538801243" watchObservedRunningTime="2026-01-27 15:28:26.568277939 +0000 UTC m=+1292.548887037" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.611460 4772 scope.go:117] "RemoveContainer" containerID="2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.639630 4772 scope.go:117] "RemoveContainer" containerID="902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.665404 4772 scope.go:117] "RemoveContainer" containerID="96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 
15:28:26.687338 4772 scope.go:117] "RemoveContainer" containerID="db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.688677 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e\": container with ID starting with db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e not found: ID does not exist" containerID="db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.688730 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e"} err="failed to get container status \"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e\": rpc error: code = NotFound desc = could not find container \"db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e\": container with ID starting with db4f9e747383adbadbac961f6c8e5009d6edc10dcb010e6f2eb4e5637b296b8e not found: ID does not exist" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.688761 4772 scope.go:117] "RemoveContainer" containerID="2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.689375 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a\": container with ID starting with 2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a not found: ID does not exist" containerID="2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.689406 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a"} err="failed to get container status \"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a\": rpc error: code = NotFound desc = could not find container \"2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a\": container with ID starting with 2d2ef8b64e19f03ac931485698899fb937f36aead7d1925135401934fbecd74a not found: ID does not exist" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.689428 4772 scope.go:117] "RemoveContainer" containerID="902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.689661 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b\": container with ID starting with 902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b not found: ID does not exist" containerID="902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.689686 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b"} err="failed to get container status \"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b\": rpc error: code = NotFound desc = could not find container \"902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b\": container with ID starting with 902d06fc4d38a39dd40f6481a4374dc4c1e7ef0957c3355454b84d0778d1bc2b not found: ID does not exist" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.689699 4772 scope.go:117] "RemoveContainer" containerID="96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.690733 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96\": container with ID starting with 96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96 not found: ID does not exist" containerID="96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.690754 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96"} err="failed to get container status \"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96\": rpc error: code = NotFound desc = could not find container \"96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96\": container with ID starting with 96c51a2493cf3f280c96b0bf8545deb92d6624f539170b50f2d2a50880a71c96 not found: ID does not exist" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702573 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702640 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 
15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702750 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702772 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhrl9\" (UniqueName: \"kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702820 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.702850 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts\") pod \"65e74d64-83f8-4964-8950-bf76816dd5fc\" (UID: \"65e74d64-83f8-4964-8950-bf76816dd5fc\") " Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.705516 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.706029 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.712253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts" (OuterVolumeSpecName: "scripts") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.727494 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9" (OuterVolumeSpecName: "kube-api-access-nhrl9") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "kube-api-access-nhrl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.756406 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.785357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.804960 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.805115 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.805237 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.805322 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.805411 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65e74d64-83f8-4964-8950-bf76816dd5fc-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.805483 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhrl9\" (UniqueName: 
\"kubernetes.io/projected/65e74d64-83f8-4964-8950-bf76816dd5fc-kube-api-access-nhrl9\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.821158 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data" (OuterVolumeSpecName: "config-data") pod "65e74d64-83f8-4964-8950-bf76816dd5fc" (UID: "65e74d64-83f8-4964-8950-bf76816dd5fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.866797 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.876577 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.895689 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896057 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-central-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896074 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-central-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896088 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="564de425-5170-45df-9080-5b02579483ee" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896094 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="564de425-5170-45df-9080-5b02579483ee" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896107 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896113 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896121 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7907cc16-7665-49d3-ad17-f9e6e0fc2f09" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896127 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7907cc16-7665-49d3-ad17-f9e6e0fc2f09" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896142 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="proxy-httpd" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896148 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="proxy-httpd" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896157 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be888039-f158-4d05-9f7d-6d01b2478b08" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896200 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="be888039-f158-4d05-9f7d-6d01b2478b08" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896218 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bbbf38-088b-4e4d-8154-569667fcf9a9" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896225 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bbbf38-088b-4e4d-8154-569667fcf9a9" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896240 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d7e14a-70d3-446e-8250-ca1047b5bc4b" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896247 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d7e14a-70d3-446e-8250-ca1047b5bc4b" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896258 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="sg-core" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896265 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="sg-core" Jan 27 15:28:26 crc kubenswrapper[4772]: E0127 15:28:26.896281 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-notification-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896288 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-notification-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896466 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="564de425-5170-45df-9080-5b02579483ee" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896485 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="be888039-f158-4d05-9f7d-6d01b2478b08" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896494 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-central-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896503 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bbbf38-088b-4e4d-8154-569667fcf9a9" 
containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896513 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="proxy-httpd" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896520 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="ceilometer-notification-agent" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896529 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896540 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7907cc16-7665-49d3-ad17-f9e6e0fc2f09" containerName="mariadb-database-create" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896551 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d7e14a-70d3-446e-8250-ca1047b5bc4b" containerName="mariadb-account-create-update" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.896562 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" containerName="sg-core" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.898091 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.905493 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.905672 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.907649 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e74d64-83f8-4964-8950-bf76816dd5fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:26 crc kubenswrapper[4772]: I0127 15:28:26.913793 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.008775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.008824 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.008871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.008985 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.009019 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.009159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxklb\" (UniqueName: \"kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.009490 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.110899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.110946 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.110969 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111006 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111035 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111483 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111700 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111752 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.111788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxklb\" (UniqueName: \"kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.117363 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.118650 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.119468 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.120039 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " 
pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.138314 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxklb\" (UniqueName: \"kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb\") pod \"ceilometer-0\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.258850 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:27 crc kubenswrapper[4772]: I0127 15:28:27.754978 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:27 crc kubenswrapper[4772]: W0127 15:28:27.757713 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde3b39d5_b15f_46c0_881a_e747e07e76a5.slice/crio-fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478 WatchSource:0}: Error finding container fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478: Status 404 returned error can't find the container with id fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478 Jan 27 15:28:28 crc kubenswrapper[4772]: I0127 15:28:28.576031 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerStarted","Data":"4c6da56f01306accbad60e3ba02a91f4cc6ed8bb905bd9286671fd7f32153ed5"} Jan 27 15:28:28 crc kubenswrapper[4772]: I0127 15:28:28.576403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerStarted","Data":"fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478"} Jan 27 15:28:28 crc kubenswrapper[4772]: I0127 15:28:28.677225 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="65e74d64-83f8-4964-8950-bf76816dd5fc" path="/var/lib/kubelet/pods/65e74d64-83f8-4964-8950-bf76816dd5fc/volumes" Jan 27 15:28:28 crc kubenswrapper[4772]: I0127 15:28:28.764713 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:28:29 crc kubenswrapper[4772]: I0127 15:28:29.589441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerStarted","Data":"48911a4a107b6bf266b45bdb20df360ce0efcf35791daa4bc1413cb966d28fb0"} Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.126966 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.394299 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-v9mqp"] Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.395514 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.400930 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.401324 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.401463 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bq7vb" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.405144 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-v9mqp"] Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.480519 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz7jw\" (UniqueName: \"kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.480602 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.480778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " 
pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.480830 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.583043 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.583414 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.583596 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz7jw\" (UniqueName: \"kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.584059 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: 
\"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.588704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.595691 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.596316 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.600678 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerStarted","Data":"9627fca4ce2bbd20c54de88fa2250d98bc1976636644d325a8225826fd2e9ef2"} Jan 27 15:28:30 crc kubenswrapper[4772]: I0127 15:28:30.603414 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz7jw\" (UniqueName: \"kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw\") pod \"nova-cell0-conductor-db-sync-v9mqp\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") " pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:30 crc kubenswrapper[4772]: 
I0127 15:28:30.717492 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.277704 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-v9mqp"] Jan 27 15:28:31 crc kubenswrapper[4772]: W0127 15:28:31.279155 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe34fbf1_61c4_46a9_9954_64ed431d2cb7.slice/crio-e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af WatchSource:0}: Error finding container e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af: Status 404 returned error can't find the container with id e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.613908 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerStarted","Data":"dce84557790ce392eba68b822eea435ede1d05fd9a392c9bd393123a9c7bf467"} Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.614017 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-central-agent" containerID="cri-o://4c6da56f01306accbad60e3ba02a91f4cc6ed8bb905bd9286671fd7f32153ed5" gracePeriod=30 Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.614041 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.614079 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="proxy-httpd" 
containerID="cri-o://dce84557790ce392eba68b822eea435ede1d05fd9a392c9bd393123a9c7bf467" gracePeriod=30 Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.614105 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="sg-core" containerID="cri-o://9627fca4ce2bbd20c54de88fa2250d98bc1976636644d325a8225826fd2e9ef2" gracePeriod=30 Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.614115 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-notification-agent" containerID="cri-o://48911a4a107b6bf266b45bdb20df360ce0efcf35791daa4bc1413cb966d28fb0" gracePeriod=30 Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.617335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" event={"ID":"fe34fbf1-61c4-46a9-9954-64ed431d2cb7","Type":"ContainerStarted","Data":"e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af"} Jan 27 15:28:31 crc kubenswrapper[4772]: I0127 15:28:31.644885 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.392901135 podStartE2EDuration="5.644866735s" podCreationTimestamp="2026-01-27 15:28:26 +0000 UTC" firstStartedPulling="2026-01-27 15:28:27.760403653 +0000 UTC m=+1293.741012751" lastFinishedPulling="2026-01-27 15:28:31.012369253 +0000 UTC m=+1296.992978351" observedRunningTime="2026-01-27 15:28:31.636758109 +0000 UTC m=+1297.617367207" watchObservedRunningTime="2026-01-27 15:28:31.644866735 +0000 UTC m=+1297.625475833" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.104263 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.168645 4772 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.168868 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66bf894476-wz7b5" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-api" containerID="cri-o://8859f4bb50887ba9951c0e2249a3e56deff79409c3a080683519e71c92360a6d" gracePeriod=30 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.169406 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66bf894476-wz7b5" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-httpd" containerID="cri-o://b5a8f7019a8ae14ffdea4c25f43d7ff45e4469316acbf03b2364b347f5933e7c" gracePeriod=30 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.601781 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.601865 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.644145 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.645561 4772 generic.go:334] "Generic (PLEG): container finished" podID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerID="b5a8f7019a8ae14ffdea4c25f43d7ff45e4469316acbf03b2364b347f5933e7c" exitCode=0 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.646030 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerDied","Data":"b5a8f7019a8ae14ffdea4c25f43d7ff45e4469316acbf03b2364b347f5933e7c"} Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653405 4772 generic.go:334] "Generic 
(PLEG): container finished" podID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerID="dce84557790ce392eba68b822eea435ede1d05fd9a392c9bd393123a9c7bf467" exitCode=0 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653446 4772 generic.go:334] "Generic (PLEG): container finished" podID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerID="9627fca4ce2bbd20c54de88fa2250d98bc1976636644d325a8225826fd2e9ef2" exitCode=2 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653457 4772 generic.go:334] "Generic (PLEG): container finished" podID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerID="48911a4a107b6bf266b45bdb20df360ce0efcf35791daa4bc1413cb966d28fb0" exitCode=0 Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653522 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerDied","Data":"dce84557790ce392eba68b822eea435ede1d05fd9a392c9bd393123a9c7bf467"} Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653559 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerDied","Data":"9627fca4ce2bbd20c54de88fa2250d98bc1976636644d325a8225826fd2e9ef2"} Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.653574 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerDied","Data":"48911a4a107b6bf266b45bdb20df360ce0efcf35791daa4bc1413cb966d28fb0"} Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.654731 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:32 crc kubenswrapper[4772]: I0127 15:28:32.656286 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.109620 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.109667 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.157101 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.159024 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.662060 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.662093 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 15:28:33 crc kubenswrapper[4772]: I0127 15:28:33.662103 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:34 crc kubenswrapper[4772]: I0127 15:28:34.674509 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:28:34 crc kubenswrapper[4772]: I0127 15:28:34.986394 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:34 crc kubenswrapper[4772]: I0127 15:28:34.990061 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.209775 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.210109 4772 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.519755 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.704315 4772 generic.go:334] "Generic (PLEG): container finished" podID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerID="8859f4bb50887ba9951c0e2249a3e56deff79409c3a080683519e71c92360a6d" exitCode=0 Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.704366 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerDied","Data":"8859f4bb50887ba9951c0e2249a3e56deff79409c3a080683519e71c92360a6d"} Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.708536 4772 generic.go:334] "Generic (PLEG): container finished" podID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerID="4c6da56f01306accbad60e3ba02a91f4cc6ed8bb905bd9286671fd7f32153ed5" exitCode=0 Jan 27 15:28:36 crc kubenswrapper[4772]: I0127 15:28:36.709472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerDied","Data":"4c6da56f01306accbad60e3ba02a91f4cc6ed8bb905bd9286671fd7f32153ed5"} Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.749003 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de3b39d5-b15f-46c0-881a-e747e07e76a5","Type":"ContainerDied","Data":"fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478"} Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.749547 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe477ed2e6bff622d3bb919bb2a03b5372b4eb037875880ee816351cacaad478" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.785544 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.893932 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894232 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxklb\" (UniqueName: \"kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894354 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894441 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894464 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.894535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts\") pod \"de3b39d5-b15f-46c0-881a-e747e07e76a5\" (UID: \"de3b39d5-b15f-46c0-881a-e747e07e76a5\") " Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.896401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.896464 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.903979 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb" (OuterVolumeSpecName: "kube-api-access-xxklb") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "kube-api-access-xxklb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.904750 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.904786 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de3b39d5-b15f-46c0-881a-e747e07e76a5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.904801 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxklb\" (UniqueName: \"kubernetes.io/projected/de3b39d5-b15f-46c0-881a-e747e07e76a5-kube-api-access-xxklb\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.908055 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts" (OuterVolumeSpecName: "scripts") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.937144 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:40 crc kubenswrapper[4772]: I0127 15:28:40.955728 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.005253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.006359 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.006378 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.006388 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.024466 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data" (OuterVolumeSpecName: "config-data") pod "de3b39d5-b15f-46c0-881a-e747e07e76a5" (UID: "de3b39d5-b15f-46c0-881a-e747e07e76a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107160 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config\") pod \"e7385520-8ffb-40e5-802e-ff0db348c5c1\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107262 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb8pb\" (UniqueName: \"kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb\") pod \"e7385520-8ffb-40e5-802e-ff0db348c5c1\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107301 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle\") pod \"e7385520-8ffb-40e5-802e-ff0db348c5c1\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs\") pod \"e7385520-8ffb-40e5-802e-ff0db348c5c1\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107442 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config\") pod \"e7385520-8ffb-40e5-802e-ff0db348c5c1\" (UID: \"e7385520-8ffb-40e5-802e-ff0db348c5c1\") " Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.107834 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/de3b39d5-b15f-46c0-881a-e747e07e76a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.111723 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb" (OuterVolumeSpecName: "kube-api-access-bb8pb") pod "e7385520-8ffb-40e5-802e-ff0db348c5c1" (UID: "e7385520-8ffb-40e5-802e-ff0db348c5c1"). InnerVolumeSpecName "kube-api-access-bb8pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.112741 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e7385520-8ffb-40e5-802e-ff0db348c5c1" (UID: "e7385520-8ffb-40e5-802e-ff0db348c5c1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.160514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7385520-8ffb-40e5-802e-ff0db348c5c1" (UID: "e7385520-8ffb-40e5-802e-ff0db348c5c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.175746 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e7385520-8ffb-40e5-802e-ff0db348c5c1" (UID: "e7385520-8ffb-40e5-802e-ff0db348c5c1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.184038 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config" (OuterVolumeSpecName: "config") pod "e7385520-8ffb-40e5-802e-ff0db348c5c1" (UID: "e7385520-8ffb-40e5-802e-ff0db348c5c1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.209638 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.209667 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.209678 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb8pb\" (UniqueName: \"kubernetes.io/projected/e7385520-8ffb-40e5-802e-ff0db348c5c1-kube-api-access-bb8pb\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.209687 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.209695 4772 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7385520-8ffb-40e5-802e-ff0db348c5c1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.759491 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66bf894476-wz7b5" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.759489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66bf894476-wz7b5" event={"ID":"e7385520-8ffb-40e5-802e-ff0db348c5c1","Type":"ContainerDied","Data":"d30df4d73e5cfb24af9149a2561d9917bae2965d7778272568f0bfb1966f855f"} Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.759895 4772 scope.go:117] "RemoveContainer" containerID="b5a8f7019a8ae14ffdea4c25f43d7ff45e4469316acbf03b2364b347f5933e7c" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.762512 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.764923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" event={"ID":"fe34fbf1-61c4-46a9-9954-64ed431d2cb7","Type":"ContainerStarted","Data":"35964dfe2e497930630aeb0996d17bf7bbe0e9d5e7bfb1d7efca05167ac578fc"} Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.788504 4772 scope.go:117] "RemoveContainer" containerID="8859f4bb50887ba9951c0e2249a3e56deff79409c3a080683519e71c92360a6d" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.788936 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" podStartSLOduration=2.444191847 podStartE2EDuration="11.788913933s" podCreationTimestamp="2026-01-27 15:28:30 +0000 UTC" firstStartedPulling="2026-01-27 15:28:31.285684411 +0000 UTC m=+1297.266293509" lastFinishedPulling="2026-01-27 15:28:40.630406497 +0000 UTC m=+1306.611015595" observedRunningTime="2026-01-27 15:28:41.78019221 +0000 UTC m=+1307.760801298" watchObservedRunningTime="2026-01-27 15:28:41.788913933 +0000 UTC m=+1307.769523031" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.807254 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.817600 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.833033 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.853891 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66bf894476-wz7b5"] Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.870614 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871136 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871155 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871204 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="sg-core" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871213 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="sg-core" Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871240 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="proxy-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871247 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="proxy-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871285 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" 
containerName="ceilometer-central-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871335 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-central-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871358 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-notification-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871366 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-notification-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: E0127 15:28:41.871382 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-api" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871389 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-api" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871597 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-api" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871613 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="proxy-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871633 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" containerName="neutron-httpd" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871644 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-notification-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871658 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="ceilometer-central-agent" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.871676 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" containerName="sg-core" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.874052 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.879078 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.881932 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:28:41 crc kubenswrapper[4772]: I0127 15:28:41.882196 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.046247 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.046929 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.047080 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5526\" (UniqueName: \"kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526\") pod \"ceilometer-0\" (UID: 
\"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.047247 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.047359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.047586 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.047710 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.058627 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.058694 4772 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149111 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5526\" (UniqueName: \"kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149719 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.149900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.150922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.151644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.156587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.157602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.158724 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.160530 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.176651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5526\" (UniqueName: \"kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526\") pod \"ceilometer-0\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") " pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.201619 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.685273 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3b39d5-b15f-46c0-881a-e747e07e76a5" path="/var/lib/kubelet/pods/de3b39d5-b15f-46c0-881a-e747e07e76a5/volumes" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.687471 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7385520-8ffb-40e5-802e-ff0db348c5c1" path="/var/lib/kubelet/pods/e7385520-8ffb-40e5-802e-ff0db348c5c1/volumes" Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.758000 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.792215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerStarted","Data":"b742511aad6534f52532262eb0cfdd05571cd1513b70b4dc28a5263607304430"} Jan 27 15:28:42 crc kubenswrapper[4772]: I0127 15:28:42.966291 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:28:43 crc kubenswrapper[4772]: I0127 15:28:43.802355 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerStarted","Data":"afcf06fa22d2533b1f1a226452ae6ceefe63b8b90f23a95e55e5536a352c31c5"} Jan 27 15:28:44 crc kubenswrapper[4772]: I0127 15:28:44.812517 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerStarted","Data":"01b8e4f8a171a9643b0341141cead865a87cb972ae5d54e123bbcc5bbb627212"} Jan 27 15:28:45 crc kubenswrapper[4772]: I0127 15:28:45.849418 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerStarted","Data":"aeaac2f858ccbdd7513c3cf040b3290daac6472fdc5c899f1438b9ebc94bf571"} Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.870500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerStarted","Data":"a9e4d8d8cdfe57821f29d14d1ac46a5f3e7ed0b5e31d0de7bea6c91615349a23"} Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.871282 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-central-agent" containerID="cri-o://afcf06fa22d2533b1f1a226452ae6ceefe63b8b90f23a95e55e5536a352c31c5" gracePeriod=30 Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.871600 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.871948 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="proxy-httpd" containerID="cri-o://a9e4d8d8cdfe57821f29d14d1ac46a5f3e7ed0b5e31d0de7bea6c91615349a23" gracePeriod=30 Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.872008 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="sg-core" containerID="cri-o://aeaac2f858ccbdd7513c3cf040b3290daac6472fdc5c899f1438b9ebc94bf571" gracePeriod=30 Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.872051 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-notification-agent" containerID="cri-o://01b8e4f8a171a9643b0341141cead865a87cb972ae5d54e123bbcc5bbb627212" 
gracePeriod=30 Jan 27 15:28:47 crc kubenswrapper[4772]: I0127 15:28:47.906506 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.876167947 podStartE2EDuration="6.906479221s" podCreationTimestamp="2026-01-27 15:28:41 +0000 UTC" firstStartedPulling="2026-01-27 15:28:42.773758801 +0000 UTC m=+1308.754367909" lastFinishedPulling="2026-01-27 15:28:46.804070085 +0000 UTC m=+1312.784679183" observedRunningTime="2026-01-27 15:28:47.895963975 +0000 UTC m=+1313.876573083" watchObservedRunningTime="2026-01-27 15:28:47.906479221 +0000 UTC m=+1313.887088319" Jan 27 15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.882920 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerID="a9e4d8d8cdfe57821f29d14d1ac46a5f3e7ed0b5e31d0de7bea6c91615349a23" exitCode=0 Jan 27 15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.882961 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerID="aeaac2f858ccbdd7513c3cf040b3290daac6472fdc5c899f1438b9ebc94bf571" exitCode=2 Jan 27 15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.882975 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerID="01b8e4f8a171a9643b0341141cead865a87cb972ae5d54e123bbcc5bbb627212" exitCode=0 Jan 27 15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.882985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerDied","Data":"a9e4d8d8cdfe57821f29d14d1ac46a5f3e7ed0b5e31d0de7bea6c91615349a23"} Jan 27 15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.883036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerDied","Data":"aeaac2f858ccbdd7513c3cf040b3290daac6472fdc5c899f1438b9ebc94bf571"} Jan 27 
15:28:48 crc kubenswrapper[4772]: I0127 15:28:48.883048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerDied","Data":"01b8e4f8a171a9643b0341141cead865a87cb972ae5d54e123bbcc5bbb627212"} Jan 27 15:28:50 crc kubenswrapper[4772]: I0127 15:28:50.768067 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 15:28:50 crc kubenswrapper[4772]: I0127 15:28:50.768182 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c94a7cfa-28e2-4d52-85a1-d5586f162227" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 15:28:53 crc kubenswrapper[4772]: I0127 15:28:53.943399 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerID="afcf06fa22d2533b1f1a226452ae6ceefe63b8b90f23a95e55e5536a352c31c5" exitCode=0 Jan 27 15:28:53 crc kubenswrapper[4772]: I0127 15:28:53.943551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerDied","Data":"afcf06fa22d2533b1f1a226452ae6ceefe63b8b90f23a95e55e5536a352c31c5"} Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.236641 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.357506 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.357802 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.357826 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5526\" (UniqueName: \"kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.357855 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.357956 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.358046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.358083 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd\") pod \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\" (UID: \"b4ce266c-6d03-4c51-a8e7-2439eecdf67d\") "
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.359020 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.359272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.365343 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526" (OuterVolumeSpecName: "kube-api-access-p5526") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "kube-api-access-p5526". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.365344 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts" (OuterVolumeSpecName: "scripts") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.388098 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.440316 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460226 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460337 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5526\" (UniqueName: \"kubernetes.io/projected/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-kube-api-access-p5526\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460353 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460364 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460375 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.460389 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.478358 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data" (OuterVolumeSpecName: "config-data") pod "b4ce266c-6d03-4c51-a8e7-2439eecdf67d" (UID: "b4ce266c-6d03-4c51-a8e7-2439eecdf67d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.562266 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4ce266c-6d03-4c51-a8e7-2439eecdf67d-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.958547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4ce266c-6d03-4c51-a8e7-2439eecdf67d","Type":"ContainerDied","Data":"b742511aad6534f52532262eb0cfdd05571cd1513b70b4dc28a5263607304430"}
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.958600 4772 scope.go:117] "RemoveContainer" containerID="a9e4d8d8cdfe57821f29d14d1ac46a5f3e7ed0b5e31d0de7bea6c91615349a23"
Jan 27 15:28:54 crc kubenswrapper[4772]: I0127 15:28:54.958732 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:54.993094 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.015247 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.029468 4772 scope.go:117] "RemoveContainer" containerID="aeaac2f858ccbdd7513c3cf040b3290daac6472fdc5c899f1438b9ebc94bf571"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.042227 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:55 crc kubenswrapper[4772]: E0127 15:28:55.042754 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-central-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.042779 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-central-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: E0127 15:28:55.042806 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-notification-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.042814 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-notification-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: E0127 15:28:55.042827 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="sg-core"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.042837 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="sg-core"
Jan 27 15:28:55 crc kubenswrapper[4772]: E0127 15:28:55.042858 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="proxy-httpd"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.042865 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="proxy-httpd"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.043060 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="proxy-httpd"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.043078 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-notification-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.043108 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="sg-core"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.043121 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" containerName="ceilometer-central-agent"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.045121 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.050032 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.050864 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.070208 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.122219 4772 scope.go:117] "RemoveContainer" containerID="01b8e4f8a171a9643b0341141cead865a87cb972ae5d54e123bbcc5bbb627212"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200274 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzcc\" (UniqueName: \"kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200382 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.200428 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.212440 4772 scope.go:117] "RemoveContainer" containerID="afcf06fa22d2533b1f1a226452ae6ceefe63b8b90f23a95e55e5536a352c31c5"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301514 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzcc\" (UniqueName: \"kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301799 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301841 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301902 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.301959 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.302505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.302750 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.303428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.306739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.307350 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.311855 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.312423 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.318401 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzcc\" (UniqueName: \"kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc\") pod \"ceilometer-0\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.433449 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.878825 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.968223 4772 generic.go:334] "Generic (PLEG): container finished" podID="fe34fbf1-61c4-46a9-9954-64ed431d2cb7" containerID="35964dfe2e497930630aeb0996d17bf7bbe0e9d5e7bfb1d7efca05167ac578fc" exitCode=0
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.968297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" event={"ID":"fe34fbf1-61c4-46a9-9954-64ed431d2cb7","Type":"ContainerDied","Data":"35964dfe2e497930630aeb0996d17bf7bbe0e9d5e7bfb1d7efca05167ac578fc"}
Jan 27 15:28:55 crc kubenswrapper[4772]: I0127 15:28:55.970781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerStarted","Data":"6e415f02d11a2c8f15d05a7114c5f8606abeaaf34e280ab6666002c1dea01ba4"}
Jan 27 15:28:56 crc kubenswrapper[4772]: I0127 15:28:56.679311 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ce266c-6d03-4c51-a8e7-2439eecdf67d" path="/var/lib/kubelet/pods/b4ce266c-6d03-4c51-a8e7-2439eecdf67d/volumes"
Jan 27 15:28:56 crc kubenswrapper[4772]: I0127 15:28:56.982073 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerStarted","Data":"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310"}
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.329305 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-v9mqp"
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.435033 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data\") pod \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") "
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.435148 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle\") pod \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") "
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.435213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts\") pod \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") "
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.435307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz7jw\" (UniqueName: \"kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw\") pod \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\" (UID: \"fe34fbf1-61c4-46a9-9954-64ed431d2cb7\") "
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.453359 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw" (OuterVolumeSpecName: "kube-api-access-pz7jw") pod "fe34fbf1-61c4-46a9-9954-64ed431d2cb7" (UID: "fe34fbf1-61c4-46a9-9954-64ed431d2cb7"). InnerVolumeSpecName "kube-api-access-pz7jw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.455815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts" (OuterVolumeSpecName: "scripts") pod "fe34fbf1-61c4-46a9-9954-64ed431d2cb7" (UID: "fe34fbf1-61c4-46a9-9954-64ed431d2cb7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.464618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data" (OuterVolumeSpecName: "config-data") pod "fe34fbf1-61c4-46a9-9954-64ed431d2cb7" (UID: "fe34fbf1-61c4-46a9-9954-64ed431d2cb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.466247 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe34fbf1-61c4-46a9-9954-64ed431d2cb7" (UID: "fe34fbf1-61c4-46a9-9954-64ed431d2cb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.537507 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.537558 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.537577 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.537590 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz7jw\" (UniqueName: \"kubernetes.io/projected/fe34fbf1-61c4-46a9-9954-64ed431d2cb7-kube-api-access-pz7jw\") on node \"crc\" DevicePath \"\""
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.994995 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerStarted","Data":"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770"}
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.996948 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-v9mqp" event={"ID":"fe34fbf1-61c4-46a9-9954-64ed431d2cb7","Type":"ContainerDied","Data":"e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af"}
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.997001 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e097abc13b5b9498583b3db2e2e89e5740a2e06f350bc375a47375cb723458af"
Jan 27 15:28:57 crc kubenswrapper[4772]: I0127 15:28:57.997014 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-v9mqp"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.099875 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 15:28:58 crc kubenswrapper[4772]: E0127 15:28:58.100271 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe34fbf1-61c4-46a9-9954-64ed431d2cb7" containerName="nova-cell0-conductor-db-sync"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.100291 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe34fbf1-61c4-46a9-9954-64ed431d2cb7" containerName="nova-cell0-conductor-db-sync"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.100531 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe34fbf1-61c4-46a9-9954-64ed431d2cb7" containerName="nova-cell0-conductor-db-sync"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.101180 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.105789 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.105850 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bq7vb"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.112730 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.250000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.250060 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.250091 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnx2\" (UniqueName: \"kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.352699 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.352978 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.352998 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txnx2\" (UniqueName: \"kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.359009 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.372269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.376679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnx2\" (UniqueName: \"kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2\") pod \"nova-cell0-conductor-0\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.464228 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.574090 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 15:28:58 crc kubenswrapper[4772]: W0127 15:28:58.974673 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd9ac534_7732_417d_81a3_573fe821b26d.slice/crio-1fa0fda721794873849780c939c36b0ae92fda6799dd0ff519efa3b4fb4008b6 WatchSource:0}: Error finding container 1fa0fda721794873849780c939c36b0ae92fda6799dd0ff519efa3b4fb4008b6: Status 404 returned error can't find the container with id 1fa0fda721794873849780c939c36b0ae92fda6799dd0ff519efa3b4fb4008b6
Jan 27 15:28:58 crc kubenswrapper[4772]: I0127 15:28:58.983154 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 15:28:59 crc kubenswrapper[4772]: I0127 15:28:59.014045 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 15:28:59 crc kubenswrapper[4772]: I0127 15:28:59.043788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerStarted","Data":"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9"}
Jan 27 15:28:59 crc kubenswrapper[4772]: I0127 15:28:59.059807 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd9ac534-7732-417d-81a3-573fe821b26d","Type":"ContainerStarted","Data":"1fa0fda721794873849780c939c36b0ae92fda6799dd0ff519efa3b4fb4008b6"}
Jan 27 15:29:00 crc kubenswrapper[4772]: I0127 15:29:00.071856 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd9ac534-7732-417d-81a3-573fe821b26d","Type":"ContainerStarted","Data":"d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7"}
Jan 27 15:29:00 crc kubenswrapper[4772]: I0127 15:29:00.072352 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 27 15:29:00 crc kubenswrapper[4772]: I0127 15:29:00.071978 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" gracePeriod=30
Jan 27 15:29:00 crc kubenswrapper[4772]: I0127 15:29:00.094899 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.094878454 podStartE2EDuration="2.094878454s" podCreationTimestamp="2026-01-27 15:28:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:00.089212709 +0000 UTC m=+1326.069821817" watchObservedRunningTime="2026-01-27 15:29:00.094878454 +0000 UTC m=+1326.075487552"
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.083115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerStarted","Data":"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1"}
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.083317 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-central-agent" containerID="cri-o://ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310" gracePeriod=30
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.083370 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="sg-core" containerID="cri-o://ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9" gracePeriod=30
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.083396 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="proxy-httpd" containerID="cri-o://dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1" gracePeriod=30
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.083409 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-notification-agent" containerID="cri-o://98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770" gracePeriod=30
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.084093 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 27 15:29:01 crc kubenswrapper[4772]: I0127 15:29:01.119666 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.400734446 podStartE2EDuration="7.119644702s" podCreationTimestamp="2026-01-27 15:28:54 +0000 UTC" firstStartedPulling="2026-01-27 15:28:55.880665514 +0000 UTC m=+1321.861274622" lastFinishedPulling="2026-01-27 15:29:00.59957578 +0000 UTC m=+1326.580184878" observedRunningTime="2026-01-27 15:29:01.107785678 +0000 UTC m=+1327.088394776" watchObservedRunningTime="2026-01-27 15:29:01.119644702 +0000 UTC m=+1327.100253810"
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094253 4772 generic.go:334] "Generic (PLEG): container finished" podID="2869a695-9773-4816-90d1-34f45555b442" containerID="dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1" exitCode=0
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094480 4772 generic.go:334] "Generic (PLEG): container finished" podID="2869a695-9773-4816-90d1-34f45555b442" containerID="ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9" exitCode=2
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094489 4772 generic.go:334] "Generic (PLEG): container finished" podID="2869a695-9773-4816-90d1-34f45555b442" containerID="98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770" exitCode=0
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094328 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerDied","Data":"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1"}
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094523 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerDied","Data":"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9"}
Jan 27 15:29:02 crc kubenswrapper[4772]: I0127 15:29:02.094537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerDied","Data":"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770"}
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.716686 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856001 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") "
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") "
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856269 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") "
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856306 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnzcc\" (UniqueName: \"kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") "
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856360 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") "
Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856462 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data\") pod \"2869a695-9773-4816-90d1-34f45555b442\" (UID: \"2869a695-9773-4816-90d1-34f45555b442\") " Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856800 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.856919 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.857582 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.857614 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2869a695-9773-4816-90d1-34f45555b442-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.865537 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts" (OuterVolumeSpecName: "scripts") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.865590 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc" (OuterVolumeSpecName: "kube-api-access-tnzcc") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "kube-api-access-tnzcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.905492 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.941361 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.958711 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.958742 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.958753 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnzcc\" (UniqueName: \"kubernetes.io/projected/2869a695-9773-4816-90d1-34f45555b442-kube-api-access-tnzcc\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.958762 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:03 crc kubenswrapper[4772]: I0127 15:29:03.977302 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data" (OuterVolumeSpecName: "config-data") pod "2869a695-9773-4816-90d1-34f45555b442" (UID: "2869a695-9773-4816-90d1-34f45555b442"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.060510 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2869a695-9773-4816-90d1-34f45555b442-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.117923 4772 generic.go:334] "Generic (PLEG): container finished" podID="2869a695-9773-4816-90d1-34f45555b442" containerID="ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310" exitCode=0 Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.117966 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerDied","Data":"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310"} Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.117994 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2869a695-9773-4816-90d1-34f45555b442","Type":"ContainerDied","Data":"6e415f02d11a2c8f15d05a7114c5f8606abeaaf34e280ab6666002c1dea01ba4"} Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.117999 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.118013 4772 scope.go:117] "RemoveContainer" containerID="dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.156218 4772 scope.go:117] "RemoveContainer" containerID="ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.162574 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.176295 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.190283 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.190794 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="proxy-httpd" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.190809 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="proxy-httpd" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.190825 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="sg-core" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.190832 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="sg-core" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.190859 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-central-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.190867 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869a695-9773-4816-90d1-34f45555b442" 
containerName="ceilometer-central-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.190887 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-notification-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.190895 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-notification-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.191122 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="sg-core" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.191158 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="proxy-httpd" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.191677 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-central-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.191700 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2869a695-9773-4816-90d1-34f45555b442" containerName="ceilometer-notification-agent" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.196235 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.198039 4772 scope.go:117] "RemoveContainer" containerID="98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.200742 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.201241 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.203398 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.230583 4772 scope.go:117] "RemoveContainer" containerID="ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.247767 4772 scope.go:117] "RemoveContainer" containerID="dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.248747 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1\": container with ID starting with dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1 not found: ID does not exist" containerID="dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.248806 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1"} err="failed to get container status \"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1\": rpc error: code = NotFound desc = could not find container \"dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1\": 
container with ID starting with dd17ba31af2f6ba28c2a4b9d132f086ba22d76034e3e3228e522e489759ecad1 not found: ID does not exist" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.248827 4772 scope.go:117] "RemoveContainer" containerID="ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.249123 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9\": container with ID starting with ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9 not found: ID does not exist" containerID="ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.249273 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9"} err="failed to get container status \"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9\": rpc error: code = NotFound desc = could not find container \"ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9\": container with ID starting with ab34d41bdb3da5c2ca1372f2b064ab426f9d0b9c15e4bb1bb989886eacb17cd9 not found: ID does not exist" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.249371 4772 scope.go:117] "RemoveContainer" containerID="98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.249860 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770\": container with ID starting with 98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770 not found: ID does not exist" 
containerID="98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.249991 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770"} err="failed to get container status \"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770\": rpc error: code = NotFound desc = could not find container \"98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770\": container with ID starting with 98f0561ee38d8d108638c05ac609999f7f05dffc671417770e7d54c659d7c770 not found: ID does not exist" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.250100 4772 scope.go:117] "RemoveContainer" containerID="ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310" Jan 27 15:29:04 crc kubenswrapper[4772]: E0127 15:29:04.250613 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310\": container with ID starting with ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310 not found: ID does not exist" containerID="ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.250653 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310"} err="failed to get container status \"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310\": rpc error: code = NotFound desc = could not find container \"ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310\": container with ID starting with ede1e3bccf115909c4db09d1e983ec3d2f0212adfce0f8676afa3715415fb310 not found: ID does not exist" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.365613 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.365715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.365792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27d6x\" (UniqueName: \"kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.365920 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.366008 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.366046 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.366102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.467655 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27d6x\" (UniqueName: \"kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 
15:29:04.468547 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468726 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468832 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.468913 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.469248 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.473260 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.474293 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.476130 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.482885 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.486474 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27d6x\" (UniqueName: \"kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x\") pod \"ceilometer-0\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.519927 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:04 crc kubenswrapper[4772]: I0127 15:29:04.686385 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2869a695-9773-4816-90d1-34f45555b442" path="/var/lib/kubelet/pods/2869a695-9773-4816-90d1-34f45555b442/volumes" Jan 27 15:29:05 crc kubenswrapper[4772]: I0127 15:29:05.020611 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:05 crc kubenswrapper[4772]: I0127 15:29:05.128499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerStarted","Data":"78cb40c125f03659c88535f65c13efc1a1776dd09e65791b32311c60251b3ac1"} Jan 27 15:29:06 crc kubenswrapper[4772]: I0127 15:29:06.140651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerStarted","Data":"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93"} Jan 27 15:29:07 crc kubenswrapper[4772]: I0127 15:29:07.154358 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerStarted","Data":"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398"} Jan 27 15:29:07 crc kubenswrapper[4772]: I0127 15:29:07.154861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerStarted","Data":"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9"} Jan 27 15:29:08 crc kubenswrapper[4772]: E0127 15:29:08.467774 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:08 crc kubenswrapper[4772]: E0127 15:29:08.469378 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:08 crc kubenswrapper[4772]: E0127 15:29:08.471056 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:08 crc kubenswrapper[4772]: E0127 15:29:08.471103 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:09 crc kubenswrapper[4772]: I0127 15:29:09.172014 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerStarted","Data":"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502"} Jan 27 15:29:09 crc kubenswrapper[4772]: I0127 15:29:09.172527 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:29:09 crc kubenswrapper[4772]: I0127 15:29:09.208981 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.653556471 podStartE2EDuration="5.208950824s" podCreationTimestamp="2026-01-27 15:29:04 +0000 UTC" 
firstStartedPulling="2026-01-27 15:29:05.019574665 +0000 UTC m=+1331.000183763" lastFinishedPulling="2026-01-27 15:29:08.574969008 +0000 UTC m=+1334.555578116" observedRunningTime="2026-01-27 15:29:09.194892705 +0000 UTC m=+1335.175501863" watchObservedRunningTime="2026-01-27 15:29:09.208950824 +0000 UTC m=+1335.189559962" Jan 27 15:29:12 crc kubenswrapper[4772]: I0127 15:29:12.058697 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:29:12 crc kubenswrapper[4772]: I0127 15:29:12.059470 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:29:13 crc kubenswrapper[4772]: E0127 15:29:13.467120 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:13 crc kubenswrapper[4772]: E0127 15:29:13.468486 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:13 crc kubenswrapper[4772]: E0127 15:29:13.469730 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:13 crc kubenswrapper[4772]: E0127 15:29:13.469784 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:18 crc kubenswrapper[4772]: E0127 15:29:18.466939 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:18 crc kubenswrapper[4772]: E0127 15:29:18.469292 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:18 crc kubenswrapper[4772]: E0127 15:29:18.475896 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:18 crc kubenswrapper[4772]: E0127 15:29:18.475985 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:23 crc kubenswrapper[4772]: E0127 15:29:23.467379 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:23 crc kubenswrapper[4772]: E0127 15:29:23.471993 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:23 crc kubenswrapper[4772]: E0127 15:29:23.474686 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:23 crc kubenswrapper[4772]: E0127 15:29:23.474797 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:28 crc kubenswrapper[4772]: E0127 15:29:28.467670 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:28 crc kubenswrapper[4772]: E0127 15:29:28.469868 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:28 crc kubenswrapper[4772]: E0127 15:29:28.471927 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:29:28 crc kubenswrapper[4772]: E0127 15:29:28.472011 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.374344 4772 generic.go:334] "Generic (PLEG): container finished" podID="bd9ac534-7732-417d-81a3-573fe821b26d" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" exitCode=137 Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.374432 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd9ac534-7732-417d-81a3-573fe821b26d","Type":"ContainerDied","Data":"d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7"} Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.485379 4772 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.586412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data\") pod \"bd9ac534-7732-417d-81a3-573fe821b26d\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.586614 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txnx2\" (UniqueName: \"kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2\") pod \"bd9ac534-7732-417d-81a3-573fe821b26d\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.586659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") pod \"bd9ac534-7732-417d-81a3-573fe821b26d\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.608679 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2" (OuterVolumeSpecName: "kube-api-access-txnx2") pod "bd9ac534-7732-417d-81a3-573fe821b26d" (UID: "bd9ac534-7732-417d-81a3-573fe821b26d"). InnerVolumeSpecName "kube-api-access-txnx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:30 crc kubenswrapper[4772]: E0127 15:29:30.659079 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle podName:bd9ac534-7732-417d-81a3-573fe821b26d nodeName:}" failed. 
No retries permitted until 2026-01-27 15:29:31.159049248 +0000 UTC m=+1357.139658356 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle") pod "bd9ac534-7732-417d-81a3-573fe821b26d" (UID: "bd9ac534-7732-417d-81a3-573fe821b26d") : error deleting /var/lib/kubelet/pods/bd9ac534-7732-417d-81a3-573fe821b26d/volume-subpaths: remove /var/lib/kubelet/pods/bd9ac534-7732-417d-81a3-573fe821b26d/volume-subpaths: no such file or directory Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.663013 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data" (OuterVolumeSpecName: "config-data") pod "bd9ac534-7732-417d-81a3-573fe821b26d" (UID: "bd9ac534-7732-417d-81a3-573fe821b26d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.688971 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:30 crc kubenswrapper[4772]: I0127 15:29:30.689019 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txnx2\" (UniqueName: \"kubernetes.io/projected/bd9ac534-7732-417d-81a3-573fe821b26d-kube-api-access-txnx2\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.198935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") pod \"bd9ac534-7732-417d-81a3-573fe821b26d\" (UID: \"bd9ac534-7732-417d-81a3-573fe821b26d\") " Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.202436 4772 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd9ac534-7732-417d-81a3-573fe821b26d" (UID: "bd9ac534-7732-417d-81a3-573fe821b26d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.301062 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9ac534-7732-417d-81a3-573fe821b26d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.385201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bd9ac534-7732-417d-81a3-573fe821b26d","Type":"ContainerDied","Data":"1fa0fda721794873849780c939c36b0ae92fda6799dd0ff519efa3b4fb4008b6"} Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.385525 4772 scope.go:117] "RemoveContainer" containerID="d0ebea91d7cab43bb0777a4dc747e6d73a7eab735ebf1eea102733e045c246c7" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.385274 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.420615 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.430906 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.447714 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:29:31 crc kubenswrapper[4772]: E0127 15:29:31.448291 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.448311 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.448523 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" containerName="nova-cell0-conductor-conductor" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.449235 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.451554 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bq7vb" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.451606 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.456691 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.607357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kbc8\" (UniqueName: \"kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.607449 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.607577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.709137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kbc8\" (UniqueName: 
\"kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.709508 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.709653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.716848 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.717101 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.734702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kbc8\" (UniqueName: \"kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8\") pod \"nova-cell0-conductor-0\" (UID: 
\"b20b9215-5398-4100-bac4-763daa5ed222\") " pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:31 crc kubenswrapper[4772]: I0127 15:29:31.770595 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:32 crc kubenswrapper[4772]: I0127 15:29:32.285029 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:29:32 crc kubenswrapper[4772]: I0127 15:29:32.402238 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b20b9215-5398-4100-bac4-763daa5ed222","Type":"ContainerStarted","Data":"09c4f5b1f70267b595eb90b2b27556cd2ace28d9594da5210f17e313d0d8a29a"} Jan 27 15:29:32 crc kubenswrapper[4772]: I0127 15:29:32.681520 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9ac534-7732-417d-81a3-573fe821b26d" path="/var/lib/kubelet/pods/bd9ac534-7732-417d-81a3-573fe821b26d/volumes" Jan 27 15:29:33 crc kubenswrapper[4772]: I0127 15:29:33.411573 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b20b9215-5398-4100-bac4-763daa5ed222","Type":"ContainerStarted","Data":"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83"} Jan 27 15:29:33 crc kubenswrapper[4772]: I0127 15:29:33.411692 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:33 crc kubenswrapper[4772]: I0127 15:29:33.435156 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.435130801 podStartE2EDuration="2.435130801s" podCreationTimestamp="2026-01-27 15:29:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:33.425531332 +0000 UTC m=+1359.406140440" watchObservedRunningTime="2026-01-27 15:29:33.435130801 +0000 
UTC m=+1359.415739899" Jan 27 15:29:34 crc kubenswrapper[4772]: I0127 15:29:34.536822 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.007106 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.007969 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1ef66151-0ea7-4696-9db0-7b6665731670" containerName="kube-state-metrics" containerID="cri-o://e93f9f446173d4fd985d40db28827a7f313c9dbe0522a2d3003fa93c8ac7de5e" gracePeriod=30 Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.471197 4772 generic.go:334] "Generic (PLEG): container finished" podID="1ef66151-0ea7-4696-9db0-7b6665731670" containerID="e93f9f446173d4fd985d40db28827a7f313c9dbe0522a2d3003fa93c8ac7de5e" exitCode=2 Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.471302 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ef66151-0ea7-4696-9db0-7b6665731670","Type":"ContainerDied","Data":"e93f9f446173d4fd985d40db28827a7f313c9dbe0522a2d3003fa93c8ac7de5e"} Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.471611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1ef66151-0ea7-4696-9db0-7b6665731670","Type":"ContainerDied","Data":"fabcd309d9b92ca01d4a1240a11210e76a8e365a872f6471e0b9d641c3e1ff39"} Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.471633 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fabcd309d9b92ca01d4a1240a11210e76a8e365a872f6471e0b9d641c3e1ff39" Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.520699 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.680770 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znb2g\" (UniqueName: \"kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g\") pod \"1ef66151-0ea7-4696-9db0-7b6665731670\" (UID: \"1ef66151-0ea7-4696-9db0-7b6665731670\") " Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.688683 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g" (OuterVolumeSpecName: "kube-api-access-znb2g") pod "1ef66151-0ea7-4696-9db0-7b6665731670" (UID: "1ef66151-0ea7-4696-9db0-7b6665731670"). InnerVolumeSpecName "kube-api-access-znb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:38 crc kubenswrapper[4772]: I0127 15:29:38.782628 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znb2g\" (UniqueName: \"kubernetes.io/projected/1ef66151-0ea7-4696-9db0-7b6665731670-kube-api-access-znb2g\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.479204 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.516225 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.526585 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.545831 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:39 crc kubenswrapper[4772]: E0127 15:29:39.546611 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef66151-0ea7-4696-9db0-7b6665731670" containerName="kube-state-metrics" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.546935 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef66151-0ea7-4696-9db0-7b6665731670" containerName="kube-state-metrics" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.547263 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef66151-0ea7-4696-9db0-7b6665731670" containerName="kube-state-metrics" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.548085 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.550428 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.550593 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.564738 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.698980 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.699043 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.699081 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.699191 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblgh\" (UniqueName: 
\"kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.745272 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.745597 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-central-agent" containerID="cri-o://c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93" gracePeriod=30 Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.745735 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="proxy-httpd" containerID="cri-o://c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502" gracePeriod=30 Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.745795 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="sg-core" containerID="cri-o://83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398" gracePeriod=30 Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.745841 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-notification-agent" containerID="cri-o://3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9" gracePeriod=30 Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.800826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblgh\" (UniqueName: 
\"kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.801288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.801400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.801507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.807322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.807418 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.813418 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.822623 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblgh\" (UniqueName: \"kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh\") pod \"kube-state-metrics-0\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " pod="openstack/kube-state-metrics-0" Jan 27 15:29:39 crc kubenswrapper[4772]: I0127 15:29:39.876239 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 15:29:40 crc kubenswrapper[4772]: W0127 15:29:40.329570 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21f54218_5889_4ae9_a7a1_7ed4895ad63c.slice/crio-cf3bc864ff0528c25cfa09a147802c26a644517c099a85fc5bafd7c4da9534c3 WatchSource:0}: Error finding container cf3bc864ff0528c25cfa09a147802c26a644517c099a85fc5bafd7c4da9534c3: Status 404 returned error can't find the container with id cf3bc864ff0528c25cfa09a147802c26a644517c099a85fc5bafd7c4da9534c3 Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.331733 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.332115 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.492594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"21f54218-5889-4ae9-a7a1-7ed4895ad63c","Type":"ContainerStarted","Data":"cf3bc864ff0528c25cfa09a147802c26a644517c099a85fc5bafd7c4da9534c3"} Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.496966 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerID="c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502" exitCode=0 Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.497002 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerID="83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398" exitCode=2 Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.497014 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerID="c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93" exitCode=0 
Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.497036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerDied","Data":"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502"} Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.497064 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerDied","Data":"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398"} Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.497076 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerDied","Data":"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93"} Jan 27 15:29:40 crc kubenswrapper[4772]: I0127 15:29:40.674751 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef66151-0ea7-4696-9db0-7b6665731670" path="/var/lib/kubelet/pods/1ef66151-0ea7-4696-9db0-7b6665731670/volumes" Jan 27 15:29:41 crc kubenswrapper[4772]: I0127 15:29:41.506826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"21f54218-5889-4ae9-a7a1-7ed4895ad63c","Type":"ContainerStarted","Data":"670d5287e2a9882bc2137122191964eb76c57b36df9c904f50db621c1141ab98"} Jan 27 15:29:41 crc kubenswrapper[4772]: I0127 15:29:41.507411 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 27 15:29:41 crc kubenswrapper[4772]: I0127 15:29:41.530936 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.128304677 podStartE2EDuration="2.530913812s" podCreationTimestamp="2026-01-27 15:29:39 +0000 UTC" firstStartedPulling="2026-01-27 15:29:40.331783248 +0000 UTC m=+1366.312392346" 
lastFinishedPulling="2026-01-27 15:29:40.734392373 +0000 UTC m=+1366.715001481" observedRunningTime="2026-01-27 15:29:41.523129397 +0000 UTC m=+1367.503738505" watchObservedRunningTime="2026-01-27 15:29:41.530913812 +0000 UTC m=+1367.511522910" Jan 27 15:29:41 crc kubenswrapper[4772]: I0127 15:29:41.799622 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.059071 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.059147 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.059214 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.059730 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.059788 4772 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9" gracePeriod=600 Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.439038 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h5ch7"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.440100 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.443316 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.448664 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.453381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5ch7"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.519664 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9" exitCode=0 Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.519737 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9"} Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.519794 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2"} Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.519814 4772 scope.go:117] "RemoveContainer" containerID="ed9bc8d4920540552bc96f7af996996e69c893224418d74c897e7298ed107163" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.549214 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.549303 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.549345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.549376 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftl7j\" (UniqueName: \"kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc 
kubenswrapper[4772]: I0127 15:29:42.638936 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.640337 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.644951 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.651544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.651651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.651737 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.651804 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftl7j\" (UniqueName: \"kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " 
pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.659402 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.660855 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.667797 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.668051 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.668339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.690861 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.701400 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.722187 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftl7j\" 
(UniqueName: \"kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j\") pod \"nova-cell0-cell-mapping-h5ch7\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.726241 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756465 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgjrt\" (UniqueName: \"kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756591 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756619 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vn6\" (UniqueName: \"kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756675 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756762 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756813 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.756927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.767789 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.781053 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.783495 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.787682 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.834452 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.858430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.858885 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgjrt\" (UniqueName: \"kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.858946 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.858972 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vn6\" (UniqueName: \"kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.859004 4772 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.859061 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.859103 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.861292 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.877128 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.887597 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 
15:29:42.889890 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.890321 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.907707 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgjrt\" (UniqueName: \"kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt\") pod \"nova-cell1-novncproxy-0\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.908295 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vn6\" (UniqueName: \"kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6\") pod \"nova-api-0\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " pod="openstack/nova-api-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.912928 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.914386 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.917898 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.941436 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.964400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.964441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.964481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xml5\" (UniqueName: \"kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.964604 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.983982 4772 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:29:42 crc kubenswrapper[4772]: I0127 15:29:42.985553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.002020 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.065952 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.065999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.066038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.066063 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xml5\" (UniqueName: \"kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.066084 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.066191 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx5gw\" (UniqueName: \"kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.066213 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.068595 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.070023 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.070096 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.098553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.098831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xml5\" (UniqueName: \"kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5\") pod \"nova-metadata-0\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.136635 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.161272 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.172682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.172743 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.172780 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config\") pod 
\"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.172882 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.173188 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.173296 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx5gw\" (UniqueName: \"kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.173341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.173435 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb\") pod 
\"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.173466 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2fc\" (UniqueName: \"kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.182414 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.182718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.210989 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx5gw\" (UniqueName: \"kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw\") pod \"nova-scheduler-0\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " 
pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275585 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2fc\" (UniqueName: \"kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275657 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.275699 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" 
Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.277339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.277938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.278586 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.280476 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.283557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.283932 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.329811 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2fc\" (UniqueName: \"kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc\") pod \"dnsmasq-dns-845d6d6f59-88t2p\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.478515 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.553021 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerID="3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9" exitCode=0 Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.553320 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.553348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerDied","Data":"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9"} Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.553684 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ebbc7c3-09c3-4524-854b-e0d64400ab93","Type":"ContainerDied","Data":"78cb40c125f03659c88535f65c13efc1a1776dd09e65791b32311c60251b3ac1"} Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.553713 4772 scope.go:117] "RemoveContainer" containerID="c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.583487 4772 scope.go:117] "RemoveContainer" containerID="83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584370 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: 
\"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584549 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.584697 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27d6x\" (UniqueName: \"kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x\") pod \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\" (UID: \"3ebbc7c3-09c3-4524-854b-e0d64400ab93\") " Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.585428 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.585791 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.602456 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts" (OuterVolumeSpecName: "scripts") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.625865 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.640705 4772 scope.go:117] "RemoveContainer" containerID="3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.640932 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x" (OuterVolumeSpecName: "kube-api-access-27d6x") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "kube-api-access-27d6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.652557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.694671 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.694703 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.694718 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.694732 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27d6x\" (UniqueName: \"kubernetes.io/projected/3ebbc7c3-09c3-4524-854b-e0d64400ab93-kube-api-access-27d6x\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.694744 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ebbc7c3-09c3-4524-854b-e0d64400ab93-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.697122 4772 scope.go:117] "RemoveContainer" containerID="c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93" 
Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.761439 4772 scope.go:117] "RemoveContainer" containerID="c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.763914 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502\": container with ID starting with c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502 not found: ID does not exist" containerID="c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.763957 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502"} err="failed to get container status \"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502\": rpc error: code = NotFound desc = could not find container \"c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502\": container with ID starting with c37e3598e7c9e4e7088ad7d7ea7e00efe5c231d10378509f82fb7d2b9df0b502 not found: ID does not exist" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.763986 4772 scope.go:117] "RemoveContainer" containerID="83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.764555 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398\": container with ID starting with 83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398 not found: ID does not exist" containerID="83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.764607 4772 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398"} err="failed to get container status \"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398\": rpc error: code = NotFound desc = could not find container \"83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398\": container with ID starting with 83aac8ec33dfd5bc2c51a863d771b4120a23b62bf31ab0c79c0cd15ea11a7398 not found: ID does not exist" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.764621 4772 scope.go:117] "RemoveContainer" containerID="3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.764965 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9\": container with ID starting with 3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9 not found: ID does not exist" containerID="3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.764994 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9"} err="failed to get container status \"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9\": rpc error: code = NotFound desc = could not find container \"3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9\": container with ID starting with 3b39e9533e1851e42935b988f031239d201f08489a2d644eb389253a63ab74a9 not found: ID does not exist" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.765007 4772 scope.go:117] "RemoveContainer" containerID="c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.765813 4772 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93\": container with ID starting with c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93 not found: ID does not exist" containerID="c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.765860 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93"} err="failed to get container status \"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93\": rpc error: code = NotFound desc = could not find container \"c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93\": container with ID starting with c83d73369af2ff787dcb6d403df3f89cc18f8c344ca9a168a243d6708efe0e93 not found: ID does not exist" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.783495 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.790553 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data" (OuterVolumeSpecName: "config-data") pod "3ebbc7c3-09c3-4524-854b-e0d64400ab93" (UID: "3ebbc7c3-09c3-4524-854b-e0d64400ab93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.822369 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.822435 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ebbc7c3-09c3-4524-854b-e0d64400ab93-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:43 crc kubenswrapper[4772]: W0127 15:29:43.822582 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea2e7e0f_aef9_4687_932c_d21f24fd4bff.slice/crio-68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86 WatchSource:0}: Error finding container 68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86: Status 404 returned error can't find the container with id 68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86 Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.845595 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5ch7"] Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.873986 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:29:43 crc kubenswrapper[4772]: W0127 15:29:43.877809 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79db711_df56_46f6_93c2_7e1e5c914ba6.slice/crio-ea5b86feefaf42849d7aeffa2e275c403579d3f00ecb406df1ac2ae06ceefa44 WatchSource:0}: Error finding container ea5b86feefaf42849d7aeffa2e275c403579d3f00ecb406df1ac2ae06ceefa44: Status 404 returned error can't find the container with id ea5b86feefaf42849d7aeffa2e275c403579d3f00ecb406df1ac2ae06ceefa44 Jan 27 15:29:43 crc 
kubenswrapper[4772]: I0127 15:29:43.885886 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:29:43 crc kubenswrapper[4772]: W0127 15:29:43.910719 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a218ca3_a163_4f9c_8e73_f630b2228bb2.slice/crio-d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c WatchSource:0}: Error finding container d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c: Status 404 returned error can't find the container with id d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.935661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjwh2"] Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.936086 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="sg-core" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936098 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="sg-core" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.936112 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-notification-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936118 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-notification-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.936127 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="proxy-httpd" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936133 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="proxy-httpd" Jan 27 15:29:43 crc kubenswrapper[4772]: E0127 15:29:43.936161 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-central-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936179 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-central-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936431 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="sg-core" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936453 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="proxy-httpd" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936463 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-central-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.936475 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" containerName="ceilometer-notification-agent" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.937132 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.942046 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.942278 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 15:29:43 crc kubenswrapper[4772]: I0127 15:29:43.971476 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjwh2"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.001308 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.021258 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.028350 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.028434 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nh52\" (UniqueName: \"kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.028550 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.028629 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.040258 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.042921 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.045675 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.045938 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.046140 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.056534 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.084653 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.108978 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130324 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130361 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130421 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130453 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130563 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nh52\" (UniqueName: \"kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130641 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br29b\" (UniqueName: 
\"kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.130663 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.134613 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.135811 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.136030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.149865 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nh52\" (UniqueName: \"kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52\") pod 
\"nova-cell1-conductor-db-sync-gjwh2\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.232761 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.232830 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.232865 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.232929 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br29b\" (UniqueName: \"kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.232969 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.233012 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.233058 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.233089 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.233778 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.235139 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.242521 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " 
pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.245387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.248271 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.251857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.252804 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.269073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br29b\" (UniqueName: \"kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b\") pod \"ceilometer-0\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.276811 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.304726 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.399076 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.597833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a218ca3-a163-4f9c-8e73-f630b2228bb2","Type":"ContainerStarted","Data":"d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.610256 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2bfae3d5-7017-4ced-9691-4255769c51f6","Type":"ContainerStarted","Data":"1941b03ee46ef8231c28748227a7ca5084953c3a384610a809f7eb266c2d086a"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.626142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5ch7" event={"ID":"ea2e7e0f-aef9-4687-932c-d21f24fd4bff","Type":"ContainerStarted","Data":"bea9ecc5c8bd7f22996f379a16987a5468d25478afcbfdd986751cd73382ded7"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.626221 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5ch7" event={"ID":"ea2e7e0f-aef9-4687-932c-d21f24fd4bff","Type":"ContainerStarted","Data":"68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.659882 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerStarted","Data":"ea5b86feefaf42849d7aeffa2e275c403579d3f00ecb406df1ac2ae06ceefa44"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 
15:29:44.660309 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h5ch7" podStartSLOduration=2.6602895159999997 podStartE2EDuration="2.660289516s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:44.649274608 +0000 UTC m=+1370.629883706" watchObservedRunningTime="2026-01-27 15:29:44.660289516 +0000 UTC m=+1370.640898614" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.720538 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ebbc7c3-09c3-4524-854b-e0d64400ab93" path="/var/lib/kubelet/pods/3ebbc7c3-09c3-4524-854b-e0d64400ab93/volumes" Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.721389 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" event={"ID":"73ee81ee-57fa-466a-8ada-2fa4da5987a0","Type":"ContainerStarted","Data":"79cc249e145e1f853047b2402d998a0aa111ca080edd461a766047221e313d63"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.721414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerStarted","Data":"0fbb6b12d9611d132f4209f04274b5892a26d8ef07f59ee5c52d51242f78a833"} Jan 27 15:29:44 crc kubenswrapper[4772]: I0127 15:29:44.857019 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjwh2"] Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.005610 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:29:45 crc kubenswrapper[4772]: W0127 15:29:45.007460 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod445d3e38_8f68_4dad_9e97_d927d60ee1e4.slice/crio-5f55b8adcf12a6b66983b6beb58f8b720085a1c824b9e5e763f7ea5a2b511df2 WatchSource:0}: Error finding container 5f55b8adcf12a6b66983b6beb58f8b720085a1c824b9e5e763f7ea5a2b511df2: Status 404 returned error can't find the container with id 5f55b8adcf12a6b66983b6beb58f8b720085a1c824b9e5e763f7ea5a2b511df2 Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.699081 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" event={"ID":"f91bfd1b-6386-444f-95da-045fbe957f5c","Type":"ContainerStarted","Data":"e1df482be0829e766abad6c9eb6842ba0e9d9f6fb517127a47f819ce8b296c7d"} Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.699589 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" event={"ID":"f91bfd1b-6386-444f-95da-045fbe957f5c","Type":"ContainerStarted","Data":"a1515e80dd84c3d7eeadb4bf57668100c5ba9a6a2123be07618feb4932b00619"} Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.703348 4772 generic.go:334] "Generic (PLEG): container finished" podID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerID="4148bc33036bbf1369f8777e53357b14d6ea084f15c60f06da4d4195151a1ddd" exitCode=0 Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.703531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" event={"ID":"73ee81ee-57fa-466a-8ada-2fa4da5987a0","Type":"ContainerDied","Data":"4148bc33036bbf1369f8777e53357b14d6ea084f15c60f06da4d4195151a1ddd"} Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.709586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerStarted","Data":"5f55b8adcf12a6b66983b6beb58f8b720085a1c824b9e5e763f7ea5a2b511df2"} Jan 27 15:29:45 crc kubenswrapper[4772]: I0127 15:29:45.726681 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" podStartSLOduration=2.726665913 podStartE2EDuration="2.726665913s" podCreationTimestamp="2026-01-27 15:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:45.724714007 +0000 UTC m=+1371.705323115" watchObservedRunningTime="2026-01-27 15:29:45.726665913 +0000 UTC m=+1371.707275011" Jan 27 15:29:46 crc kubenswrapper[4772]: I0127 15:29:46.734892 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" event={"ID":"73ee81ee-57fa-466a-8ada-2fa4da5987a0","Type":"ContainerStarted","Data":"e6ff45d04539aabc690d9ba73108fbdca7b2759433b3c528ba073567c047da4f"} Jan 27 15:29:46 crc kubenswrapper[4772]: I0127 15:29:46.766613 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" podStartSLOduration=4.766594985 podStartE2EDuration="4.766594985s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:46.760643863 +0000 UTC m=+1372.741252981" watchObservedRunningTime="2026-01-27 15:29:46.766594985 +0000 UTC m=+1372.747204083" Jan 27 15:29:46 crc kubenswrapper[4772]: I0127 15:29:46.983639 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:29:46 crc kubenswrapper[4772]: I0127 15:29:46.994158 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:29:47 crc kubenswrapper[4772]: I0127 15:29:47.750873 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.762531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerStarted","Data":"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.764359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerStarted","Data":"f65fe58604f2e802f9b737bcbd1b97003be50f6c492456675251d85de107de5f"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.764406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerStarted","Data":"dea9ce36986750defb2b297fb876194a33a2f0aaee4b8e21995a37e01e57d98a"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.764440 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-log" containerID="cri-o://dea9ce36986750defb2b297fb876194a33a2f0aaee4b8e21995a37e01e57d98a" gracePeriod=30 Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.764503 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-metadata" containerID="cri-o://f65fe58604f2e802f9b737bcbd1b97003be50f6c492456675251d85de107de5f" gracePeriod=30 Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.767331 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a218ca3-a163-4f9c-8e73-f630b2228bb2","Type":"ContainerStarted","Data":"dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.767452 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49" gracePeriod=30 Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.769966 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2bfae3d5-7017-4ced-9691-4255769c51f6","Type":"ContainerStarted","Data":"63a9ec332e4c6035116beb1bf14c43d1297eb6d5bd2c76280996d80acf99516a"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.777682 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerStarted","Data":"441883730f3253404e2e897259d3ab7a0d38be7920d10151b1f2a7d3a8e23ae9"} Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.796632 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.741714889 podStartE2EDuration="6.79660952s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="2026-01-27 15:29:44.047129026 +0000 UTC m=+1370.027738134" lastFinishedPulling="2026-01-27 15:29:48.102023667 +0000 UTC m=+1374.082632765" observedRunningTime="2026-01-27 15:29:48.789619088 +0000 UTC m=+1374.770228186" watchObservedRunningTime="2026-01-27 15:29:48.79660952 +0000 UTC m=+1374.777218618" Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.811566 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.651844293 podStartE2EDuration="6.811543182s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="2026-01-27 15:29:43.945438058 +0000 UTC m=+1369.926047166" lastFinishedPulling="2026-01-27 15:29:48.105136947 +0000 UTC m=+1374.085746055" observedRunningTime="2026-01-27 15:29:48.809161663 +0000 UTC m=+1374.789770761" watchObservedRunningTime="2026-01-27 15:29:48.811543182 
+0000 UTC m=+1374.792152280" Jan 27 15:29:48 crc kubenswrapper[4772]: I0127 15:29:48.837371 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.832456401 podStartE2EDuration="6.837350387s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="2026-01-27 15:29:44.120439365 +0000 UTC m=+1370.101048463" lastFinishedPulling="2026-01-27 15:29:48.125333351 +0000 UTC m=+1374.105942449" observedRunningTime="2026-01-27 15:29:48.830329584 +0000 UTC m=+1374.810938692" watchObservedRunningTime="2026-01-27 15:29:48.837350387 +0000 UTC m=+1374.817959485" Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.788492 4772 generic.go:334] "Generic (PLEG): container finished" podID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerID="dea9ce36986750defb2b297fb876194a33a2f0aaee4b8e21995a37e01e57d98a" exitCode=143 Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.788605 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerDied","Data":"dea9ce36986750defb2b297fb876194a33a2f0aaee4b8e21995a37e01e57d98a"} Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.790960 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerStarted","Data":"b80017bd45d285abaa0954756e4b6dd746142e879b1c063421cc148826375061"} Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.795177 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerStarted","Data":"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504"} Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.810424 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.600531868 
podStartE2EDuration="7.810404997s" podCreationTimestamp="2026-01-27 15:29:42 +0000 UTC" firstStartedPulling="2026-01-27 15:29:43.893499067 +0000 UTC m=+1369.874108165" lastFinishedPulling="2026-01-27 15:29:48.103372196 +0000 UTC m=+1374.083981294" observedRunningTime="2026-01-27 15:29:49.809279324 +0000 UTC m=+1375.789888442" watchObservedRunningTime="2026-01-27 15:29:49.810404997 +0000 UTC m=+1375.791014095" Jan 27 15:29:49 crc kubenswrapper[4772]: I0127 15:29:49.901662 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 27 15:29:50 crc kubenswrapper[4772]: I0127 15:29:50.808156 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerStarted","Data":"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8"} Jan 27 15:29:52 crc kubenswrapper[4772]: I0127 15:29:52.828503 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea2e7e0f-aef9-4687-932c-d21f24fd4bff" containerID="bea9ecc5c8bd7f22996f379a16987a5468d25478afcbfdd986751cd73382ded7" exitCode=0 Jan 27 15:29:52 crc kubenswrapper[4772]: I0127 15:29:52.828586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5ch7" event={"ID":"ea2e7e0f-aef9-4687-932c-d21f24fd4bff","Type":"ContainerDied","Data":"bea9ecc5c8bd7f22996f379a16987a5468d25478afcbfdd986751cd73382ded7"} Jan 27 15:29:52 crc kubenswrapper[4772]: I0127 15:29:52.832403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerStarted","Data":"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481"} Jan 27 15:29:52 crc kubenswrapper[4772]: I0127 15:29:52.833619 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:29:52 crc kubenswrapper[4772]: I0127 15:29:52.869816 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.143749896 podStartE2EDuration="9.869795749s" podCreationTimestamp="2026-01-27 15:29:43 +0000 UTC" firstStartedPulling="2026-01-27 15:29:45.012108703 +0000 UTC m=+1370.992717801" lastFinishedPulling="2026-01-27 15:29:51.738154556 +0000 UTC m=+1377.718763654" observedRunningTime="2026-01-27 15:29:52.86911618 +0000 UTC m=+1378.849725288" watchObservedRunningTime="2026-01-27 15:29:52.869795749 +0000 UTC m=+1378.850404847" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.100056 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.100560 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.138722 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.164182 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.164259 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.285329 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.285374 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.327086 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.628375 4772 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.701758 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.702042 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-2849v" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="dnsmasq-dns" containerID="cri-o://b3b0580b2d9a989010c2055ae938a024c281531976b862414fc303ffddcf01e5" gracePeriod=10 Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.851586 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerID="b3b0580b2d9a989010c2055ae938a024c281531976b862414fc303ffddcf01e5" exitCode=0 Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.851676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2849v" event={"ID":"4ab060da-8587-413a-a410-ee0e9cec40c6","Type":"ContainerDied","Data":"b3b0580b2d9a989010c2055ae938a024c281531976b862414fc303ffddcf01e5"} Jan 27 15:29:53 crc kubenswrapper[4772]: I0127 15:29:53.906055 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.184376 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.185285 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.413378 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.423647 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534523 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftl7j\" (UniqueName: \"kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j\") pod \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534609 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534689 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data\") pod \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534757 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534788 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534827 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts\") pod \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534856 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s774r\" (UniqueName: \"kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534887 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb\") pod \"4ab060da-8587-413a-a410-ee0e9cec40c6\" (UID: \"4ab060da-8587-413a-a410-ee0e9cec40c6\") " Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.534907 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle\") pod \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\" (UID: \"ea2e7e0f-aef9-4687-932c-d21f24fd4bff\") " Jan 27 15:29:54 
crc kubenswrapper[4772]: I0127 15:29:54.541124 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j" (OuterVolumeSpecName: "kube-api-access-ftl7j") pod "ea2e7e0f-aef9-4687-932c-d21f24fd4bff" (UID: "ea2e7e0f-aef9-4687-932c-d21f24fd4bff"). InnerVolumeSpecName "kube-api-access-ftl7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.544707 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts" (OuterVolumeSpecName: "scripts") pod "ea2e7e0f-aef9-4687-932c-d21f24fd4bff" (UID: "ea2e7e0f-aef9-4687-932c-d21f24fd4bff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.545054 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r" (OuterVolumeSpecName: "kube-api-access-s774r") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "kube-api-access-s774r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.586281 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea2e7e0f-aef9-4687-932c-d21f24fd4bff" (UID: "ea2e7e0f-aef9-4687-932c-d21f24fd4bff"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.604612 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data" (OuterVolumeSpecName: "config-data") pod "ea2e7e0f-aef9-4687-932c-d21f24fd4bff" (UID: "ea2e7e0f-aef9-4687-932c-d21f24fd4bff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.623973 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.627809 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.628690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638521 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638558 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638572 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s774r\" (UniqueName: \"kubernetes.io/projected/4ab060da-8587-413a-a410-ee0e9cec40c6-kube-api-access-s774r\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638586 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638601 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638613 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftl7j\" (UniqueName: \"kubernetes.io/projected/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-kube-api-access-ftl7j\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638625 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.638635 
4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea2e7e0f-aef9-4687-932c-d21f24fd4bff-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.646514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.651629 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config" (OuterVolumeSpecName: "config") pod "4ab060da-8587-413a-a410-ee0e9cec40c6" (UID: "4ab060da-8587-413a-a410-ee0e9cec40c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.740058 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.740097 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ab060da-8587-413a-a410-ee0e9cec40c6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.864467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h5ch7" event={"ID":"ea2e7e0f-aef9-4687-932c-d21f24fd4bff","Type":"ContainerDied","Data":"68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86"} Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.864544 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f7be70d41a760b9715eb4d90633fa6d0b418abb8ac102df2c4b1b0cee8fa86" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.864618 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h5ch7" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.869637 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-2849v" event={"ID":"4ab060da-8587-413a-a410-ee0e9cec40c6","Type":"ContainerDied","Data":"894acb53cf18e87f7e2c3c3b36e872c3494f3f5477487f3a865082faee222b95"} Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.869688 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-2849v" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.869730 4772 scope.go:117] "RemoveContainer" containerID="b3b0580b2d9a989010c2055ae938a024c281531976b862414fc303ffddcf01e5" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.935467 4772 scope.go:117] "RemoveContainer" containerID="d9accb6fdf89d9e80604718e7ef1a89b03857412edcadace6f8fed5be8e5dfab" Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.946229 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:29:54 crc kubenswrapper[4772]: I0127 15:29:54.953920 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-2849v"] Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.078720 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.079471 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-log" containerID="cri-o://441883730f3253404e2e897259d3ab7a0d38be7920d10151b1f2a7d3a8e23ae9" gracePeriod=30 Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.080013 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-api" containerID="cri-o://b80017bd45d285abaa0954756e4b6dd746142e879b1c063421cc148826375061" gracePeriod=30 Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.096811 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.882296 4772 generic.go:334] "Generic (PLEG): container finished" podID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerID="441883730f3253404e2e897259d3ab7a0d38be7920d10151b1f2a7d3a8e23ae9" exitCode=143 Jan 
27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.882377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerDied","Data":"441883730f3253404e2e897259d3ab7a0d38be7920d10151b1f2a7d3a8e23ae9"} Jan 27 15:29:55 crc kubenswrapper[4772]: I0127 15:29:55.883531 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2bfae3d5-7017-4ced-9691-4255769c51f6" containerName="nova-scheduler-scheduler" containerID="cri-o://63a9ec332e4c6035116beb1bf14c43d1297eb6d5bd2c76280996d80acf99516a" gracePeriod=30 Jan 27 15:29:56 crc kubenswrapper[4772]: I0127 15:29:56.674918 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" path="/var/lib/kubelet/pods/4ab060da-8587-413a-a410-ee0e9cec40c6/volumes" Jan 27 15:29:56 crc kubenswrapper[4772]: I0127 15:29:56.896835 4772 generic.go:334] "Generic (PLEG): container finished" podID="f91bfd1b-6386-444f-95da-045fbe957f5c" containerID="e1df482be0829e766abad6c9eb6842ba0e9d9f6fb517127a47f819ce8b296c7d" exitCode=0 Jan 27 15:29:56 crc kubenswrapper[4772]: I0127 15:29:56.896934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" event={"ID":"f91bfd1b-6386-444f-95da-045fbe957f5c","Type":"ContainerDied","Data":"e1df482be0829e766abad6c9eb6842ba0e9d9f6fb517127a47f819ce8b296c7d"} Jan 27 15:29:56 crc kubenswrapper[4772]: I0127 15:29:56.899316 4772 generic.go:334] "Generic (PLEG): container finished" podID="2bfae3d5-7017-4ced-9691-4255769c51f6" containerID="63a9ec332e4c6035116beb1bf14c43d1297eb6d5bd2c76280996d80acf99516a" exitCode=0 Jan 27 15:29:56 crc kubenswrapper[4772]: I0127 15:29:56.899352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2bfae3d5-7017-4ced-9691-4255769c51f6","Type":"ContainerDied","Data":"63a9ec332e4c6035116beb1bf14c43d1297eb6d5bd2c76280996d80acf99516a"} Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.183615 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.291507 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx5gw\" (UniqueName: \"kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw\") pod \"2bfae3d5-7017-4ced-9691-4255769c51f6\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.291610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle\") pod \"2bfae3d5-7017-4ced-9691-4255769c51f6\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.291632 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data\") pod \"2bfae3d5-7017-4ced-9691-4255769c51f6\" (UID: \"2bfae3d5-7017-4ced-9691-4255769c51f6\") " Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.349666 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw" (OuterVolumeSpecName: "kube-api-access-cx5gw") pod "2bfae3d5-7017-4ced-9691-4255769c51f6" (UID: "2bfae3d5-7017-4ced-9691-4255769c51f6"). InnerVolumeSpecName "kube-api-access-cx5gw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.353245 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data" (OuterVolumeSpecName: "config-data") pod "2bfae3d5-7017-4ced-9691-4255769c51f6" (UID: "2bfae3d5-7017-4ced-9691-4255769c51f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.356928 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bfae3d5-7017-4ced-9691-4255769c51f6" (UID: "2bfae3d5-7017-4ced-9691-4255769c51f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.398452 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx5gw\" (UniqueName: \"kubernetes.io/projected/2bfae3d5-7017-4ced-9691-4255769c51f6-kube-api-access-cx5gw\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.398658 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.398795 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bfae3d5-7017-4ced-9691-4255769c51f6-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.910298 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"2bfae3d5-7017-4ced-9691-4255769c51f6","Type":"ContainerDied","Data":"1941b03ee46ef8231c28748227a7ca5084953c3a384610a809f7eb266c2d086a"} Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.910327 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.910362 4772 scope.go:117] "RemoveContainer" containerID="63a9ec332e4c6035116beb1bf14c43d1297eb6d5bd2c76280996d80acf99516a" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.954522 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.972397 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.986257 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:57 crc kubenswrapper[4772]: E0127 15:29:57.986803 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2e7e0f-aef9-4687-932c-d21f24fd4bff" containerName="nova-manage" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.986829 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2e7e0f-aef9-4687-932c-d21f24fd4bff" containerName="nova-manage" Jan 27 15:29:57 crc kubenswrapper[4772]: E0127 15:29:57.986849 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="dnsmasq-dns" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.986857 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="dnsmasq-dns" Jan 27 15:29:57 crc kubenswrapper[4772]: E0127 15:29:57.986868 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bfae3d5-7017-4ced-9691-4255769c51f6" containerName="nova-scheduler-scheduler" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 
15:29:57.986875 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bfae3d5-7017-4ced-9691-4255769c51f6" containerName="nova-scheduler-scheduler" Jan 27 15:29:57 crc kubenswrapper[4772]: E0127 15:29:57.986888 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="init" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.986896 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="init" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.987126 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2e7e0f-aef9-4687-932c-d21f24fd4bff" containerName="nova-manage" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.987150 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ab060da-8587-413a-a410-ee0e9cec40c6" containerName="dnsmasq-dns" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.987188 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bfae3d5-7017-4ced-9691-4255769c51f6" containerName="nova-scheduler-scheduler" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.987942 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.997091 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:57 crc kubenswrapper[4772]: I0127 15:29:57.998762 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.111289 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.111455 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.111499 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt2mz\" (UniqueName: \"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.213513 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.213919 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt2mz\" (UniqueName: \"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.214002 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.220028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.220206 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.232953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt2mz\" (UniqueName: \"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz\") pod \"nova-scheduler-0\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.308290 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.311053 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.427859 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle\") pod \"f91bfd1b-6386-444f-95da-045fbe957f5c\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.427970 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nh52\" (UniqueName: \"kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52\") pod \"f91bfd1b-6386-444f-95da-045fbe957f5c\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.428145 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data\") pod \"f91bfd1b-6386-444f-95da-045fbe957f5c\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.428283 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts\") pod \"f91bfd1b-6386-444f-95da-045fbe957f5c\" (UID: \"f91bfd1b-6386-444f-95da-045fbe957f5c\") " Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.435656 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts" (OuterVolumeSpecName: "scripts") pod "f91bfd1b-6386-444f-95da-045fbe957f5c" (UID: "f91bfd1b-6386-444f-95da-045fbe957f5c"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.435720 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52" (OuterVolumeSpecName: "kube-api-access-7nh52") pod "f91bfd1b-6386-444f-95da-045fbe957f5c" (UID: "f91bfd1b-6386-444f-95da-045fbe957f5c"). InnerVolumeSpecName "kube-api-access-7nh52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.459548 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f91bfd1b-6386-444f-95da-045fbe957f5c" (UID: "f91bfd1b-6386-444f-95da-045fbe957f5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.459818 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data" (OuterVolumeSpecName: "config-data") pod "f91bfd1b-6386-444f-95da-045fbe957f5c" (UID: "f91bfd1b-6386-444f-95da-045fbe957f5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.530917 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.531077 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.531087 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91bfd1b-6386-444f-95da-045fbe957f5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.531097 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nh52\" (UniqueName: \"kubernetes.io/projected/f91bfd1b-6386-444f-95da-045fbe957f5c-kube-api-access-7nh52\") on node \"crc\" DevicePath \"\"" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.693578 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bfae3d5-7017-4ced-9691-4255769c51f6" path="/var/lib/kubelet/pods/2bfae3d5-7017-4ced-9691-4255769c51f6/volumes" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.768975 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.933624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" event={"ID":"f91bfd1b-6386-444f-95da-045fbe957f5c","Type":"ContainerDied","Data":"a1515e80dd84c3d7eeadb4bf57668100c5ba9a6a2123be07618feb4932b00619"} Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.933662 4772 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a1515e80dd84c3d7eeadb4bf57668100c5ba9a6a2123be07618feb4932b00619" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.934894 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjwh2" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.935395 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a8fd83-de04-4458-8ef8-ebe7ae60194f","Type":"ContainerStarted","Data":"4d588204c22b2f28d3f4e69bf8dac95975b2db88030c7b5662a17b491819b8ab"} Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.996288 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:29:58 crc kubenswrapper[4772]: E0127 15:29:58.996762 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91bfd1b-6386-444f-95da-045fbe957f5c" containerName="nova-cell1-conductor-db-sync" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.996779 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91bfd1b-6386-444f-95da-045fbe957f5c" containerName="nova-cell1-conductor-db-sync" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.996991 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91bfd1b-6386-444f-95da-045fbe957f5c" containerName="nova-cell1-conductor-db-sync" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.997684 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:58 crc kubenswrapper[4772]: I0127 15:29:58.999384 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.019916 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.041246 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.041376 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.041428 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2cp\" (UniqueName: \"kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.143403 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc 
kubenswrapper[4772]: I0127 15:29:59.143522 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.143556 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2cp\" (UniqueName: \"kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.148986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.150041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.164846 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2cp\" (UniqueName: \"kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp\") pod \"nova-cell1-conductor-0\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.366569 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.813352 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.957472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a8fd83-de04-4458-8ef8-ebe7ae60194f","Type":"ContainerStarted","Data":"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e"} Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.958409 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dbbd3c83-3fde-4b11-8ef0-add837d393ce","Type":"ContainerStarted","Data":"4ac8efb7b8696b151d1bdc121a58850ad086edea13390d08276f3048f0eea493"} Jan 27 15:29:59 crc kubenswrapper[4772]: I0127 15:29:59.977498 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.977473201 podStartE2EDuration="2.977473201s" podCreationTimestamp="2026-01-27 15:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:29:59.9736025 +0000 UTC m=+1385.954211598" watchObservedRunningTime="2026-01-27 15:29:59.977473201 +0000 UTC m=+1385.958082299" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.137411 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6"] Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.138853 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.182854 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.183086 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.213199 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6"] Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.265864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.266242 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.266333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.368212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.368276 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.368364 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.369397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.376965 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.401941 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv\") pod \"collect-profiles-29492130-7lgz6\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.513671 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.982197 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6"] Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.997016 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dbbd3c83-3fde-4b11-8ef0-add837d393ce","Type":"ContainerStarted","Data":"788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29"} Jan 27 15:30:00 crc kubenswrapper[4772]: I0127 15:30:00.997078 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.001132 4772 generic.go:334] "Generic (PLEG): container finished" podID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerID="b80017bd45d285abaa0954756e4b6dd746142e879b1c063421cc148826375061" exitCode=0 Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.003492 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerDied","Data":"b80017bd45d285abaa0954756e4b6dd746142e879b1c063421cc148826375061"} Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.028319 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.028298589 podStartE2EDuration="3.028298589s" podCreationTimestamp="2026-01-27 15:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:01.027699231 +0000 UTC m=+1387.008308339" watchObservedRunningTime="2026-01-27 15:30:01.028298589 +0000 UTC m=+1387.008907697" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.227681 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.292245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2vn6\" (UniqueName: \"kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6\") pod \"a79db711-df56-46f6-93c2-7e1e5c914ba6\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.292482 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs\") pod \"a79db711-df56-46f6-93c2-7e1e5c914ba6\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.292511 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data\") pod \"a79db711-df56-46f6-93c2-7e1e5c914ba6\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.292656 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle\") pod \"a79db711-df56-46f6-93c2-7e1e5c914ba6\" (UID: \"a79db711-df56-46f6-93c2-7e1e5c914ba6\") " Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.293098 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs" (OuterVolumeSpecName: "logs") pod "a79db711-df56-46f6-93c2-7e1e5c914ba6" (UID: "a79db711-df56-46f6-93c2-7e1e5c914ba6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.293344 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a79db711-df56-46f6-93c2-7e1e5c914ba6-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.329309 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6" (OuterVolumeSpecName: "kube-api-access-t2vn6") pod "a79db711-df56-46f6-93c2-7e1e5c914ba6" (UID: "a79db711-df56-46f6-93c2-7e1e5c914ba6"). InnerVolumeSpecName "kube-api-access-t2vn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.334197 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a79db711-df56-46f6-93c2-7e1e5c914ba6" (UID: "a79db711-df56-46f6-93c2-7e1e5c914ba6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.334273 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data" (OuterVolumeSpecName: "config-data") pod "a79db711-df56-46f6-93c2-7e1e5c914ba6" (UID: "a79db711-df56-46f6-93c2-7e1e5c914ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.396460 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.396991 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2vn6\" (UniqueName: \"kubernetes.io/projected/a79db711-df56-46f6-93c2-7e1e5c914ba6-kube-api-access-t2vn6\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:01 crc kubenswrapper[4772]: I0127 15:30:01.397008 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a79db711-df56-46f6-93c2-7e1e5c914ba6-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.013344 4772 generic.go:334] "Generic (PLEG): container finished" podID="6d103a19-1490-433a-abdb-3ebd279265f5" containerID="58e0f9aeee1bc53c7d023bfdbaa2444440ab205390cfd9df2a1973966a2ae19f" exitCode=0 Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.013923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" event={"ID":"6d103a19-1490-433a-abdb-3ebd279265f5","Type":"ContainerDied","Data":"58e0f9aeee1bc53c7d023bfdbaa2444440ab205390cfd9df2a1973966a2ae19f"} Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.013955 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" event={"ID":"6d103a19-1490-433a-abdb-3ebd279265f5","Type":"ContainerStarted","Data":"5dc4066777884aea20f786297a419a7880d6948c75f7adb1580741a73f3871da"} Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.018030 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.019205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a79db711-df56-46f6-93c2-7e1e5c914ba6","Type":"ContainerDied","Data":"ea5b86feefaf42849d7aeffa2e275c403579d3f00ecb406df1ac2ae06ceefa44"} Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.019280 4772 scope.go:117] "RemoveContainer" containerID="b80017bd45d285abaa0954756e4b6dd746142e879b1c063421cc148826375061" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.059508 4772 scope.go:117] "RemoveContainer" containerID="441883730f3253404e2e897259d3ab7a0d38be7920d10151b1f2a7d3a8e23ae9" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.109862 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.153008 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.170820 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:02 crc kubenswrapper[4772]: E0127 15:30:02.171617 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-api" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.171708 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-api" Jan 27 15:30:02 crc kubenswrapper[4772]: E0127 15:30:02.171897 4772 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-log" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.171959 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-log" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.172259 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-api" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.172455 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" containerName="nova-api-log" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.173832 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.176542 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.182604 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.314523 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.314628 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.314766 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrvf\" (UniqueName: \"kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.314831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.418057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrvf\" (UniqueName: \"kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.418689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.418773 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.418826 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data\") pod \"nova-api-0\" (UID: 
\"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.419902 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.425204 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.425416 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.448728 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrvf\" (UniqueName: \"kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf\") pod \"nova-api-0\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.510506 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.681766 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a79db711-df56-46f6-93c2-7e1e5c914ba6" path="/var/lib/kubelet/pods/a79db711-df56-46f6-93c2-7e1e5c914ba6/volumes" Jan 27 15:30:02 crc kubenswrapper[4772]: I0127 15:30:02.967014 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.031337 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerStarted","Data":"73f31dc4802f0c84c1162163b7afaf3ae7604a699c756292b15d9d1058c35cf9"} Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.311816 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.429073 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.548121 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume\") pod \"6d103a19-1490-433a-abdb-3ebd279265f5\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.548160 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv\") pod \"6d103a19-1490-433a-abdb-3ebd279265f5\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.548361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume\") pod \"6d103a19-1490-433a-abdb-3ebd279265f5\" (UID: \"6d103a19-1490-433a-abdb-3ebd279265f5\") " Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.549188 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "6d103a19-1490-433a-abdb-3ebd279265f5" (UID: "6d103a19-1490-433a-abdb-3ebd279265f5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.551849 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv" (OuterVolumeSpecName: "kube-api-access-qk6pv") pod "6d103a19-1490-433a-abdb-3ebd279265f5" (UID: "6d103a19-1490-433a-abdb-3ebd279265f5"). 
InnerVolumeSpecName "kube-api-access-qk6pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.552871 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6d103a19-1490-433a-abdb-3ebd279265f5" (UID: "6d103a19-1490-433a-abdb-3ebd279265f5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.650763 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d103a19-1490-433a-abdb-3ebd279265f5-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.651108 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk6pv\" (UniqueName: \"kubernetes.io/projected/6d103a19-1490-433a-abdb-3ebd279265f5-kube-api-access-qk6pv\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:03 crc kubenswrapper[4772]: I0127 15:30:03.651123 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6d103a19-1490-433a-abdb-3ebd279265f5-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 15:30:04.043772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerStarted","Data":"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f"} Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 15:30:04.044815 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerStarted","Data":"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773"} Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 
15:30:04.047071 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" event={"ID":"6d103a19-1490-433a-abdb-3ebd279265f5","Type":"ContainerDied","Data":"5dc4066777884aea20f786297a419a7880d6948c75f7adb1580741a73f3871da"} Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 15:30:04.047199 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dc4066777884aea20f786297a419a7880d6948c75f7adb1580741a73f3871da" Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 15:30:04.047127 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6" Jan 27 15:30:04 crc kubenswrapper[4772]: I0127 15:30:04.481143 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.48109183 podStartE2EDuration="2.48109183s" podCreationTimestamp="2026-01-27 15:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:04.06414747 +0000 UTC m=+1390.044756588" watchObservedRunningTime="2026-01-27 15:30:04.48109183 +0000 UTC m=+1390.461700928" Jan 27 15:30:08 crc kubenswrapper[4772]: I0127 15:30:08.312070 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 15:30:08 crc kubenswrapper[4772]: I0127 15:30:08.338300 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 15:30:09 crc kubenswrapper[4772]: I0127 15:30:09.123509 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 15:30:09 crc kubenswrapper[4772]: I0127 15:30:09.469741 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 15:30:12 crc 
kubenswrapper[4772]: I0127 15:30:12.511112 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:30:12 crc kubenswrapper[4772]: I0127 15:30:12.511546 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:30:13 crc kubenswrapper[4772]: I0127 15:30:13.593428 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:13 crc kubenswrapper[4772]: I0127 15:30:13.593440 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:14 crc kubenswrapper[4772]: I0127 15:30:14.430690 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 27 15:30:18 crc kubenswrapper[4772]: W0127 15:30:18.819389 4772 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d103a19_1490_433a_abdb_3ebd279265f5.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d103a19_1490_433a_abdb_3ebd279265f5.slice: no such file or directory Jan 27 15:30:19 crc kubenswrapper[4772]: E0127 15:30:19.111006 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a218ca3_a163_4f9c_8e73_f630b2228bb2.slice/crio-conmon-dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a218ca3_a163_4f9c_8e73_f630b2228bb2.slice/crio-dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49.scope\": RecentStats: unable to find data in memory cache]" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.194645 4772 generic.go:334] "Generic (PLEG): container finished" podID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerID="f65fe58604f2e802f9b737bcbd1b97003be50f6c492456675251d85de107de5f" exitCode=137 Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.194713 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerDied","Data":"f65fe58604f2e802f9b737bcbd1b97003be50f6c492456675251d85de107de5f"} Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.202089 4772 generic.go:334] "Generic (PLEG): container finished" podID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" containerID="dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49" exitCode=137 Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.202140 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a218ca3-a163-4f9c-8e73-f630b2228bb2","Type":"ContainerDied","Data":"dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49"} Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.202185 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8a218ca3-a163-4f9c-8e73-f630b2228bb2","Type":"ContainerDied","Data":"d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c"} Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.202198 4772 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d661ce2ba7ef10af0b86479b7eca76159de7a8a82826e702dc8a584607cd118c" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.302238 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.312324 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.383861 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data\") pod \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.383912 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs\") pod \"325e0266-afde-43a7-b77c-4b29a2d55c3a\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.383961 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle\") pod \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.384029 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xml5\" (UniqueName: \"kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5\") pod \"325e0266-afde-43a7-b77c-4b29a2d55c3a\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.384060 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data\") pod \"325e0266-afde-43a7-b77c-4b29a2d55c3a\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.384155 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle\") pod \"325e0266-afde-43a7-b77c-4b29a2d55c3a\" (UID: \"325e0266-afde-43a7-b77c-4b29a2d55c3a\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.384205 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgjrt\" (UniqueName: \"kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt\") pod \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\" (UID: \"8a218ca3-a163-4f9c-8e73-f630b2228bb2\") " Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.385916 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs" (OuterVolumeSpecName: "logs") pod "325e0266-afde-43a7-b77c-4b29a2d55c3a" (UID: "325e0266-afde-43a7-b77c-4b29a2d55c3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.390107 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5" (OuterVolumeSpecName: "kube-api-access-6xml5") pod "325e0266-afde-43a7-b77c-4b29a2d55c3a" (UID: "325e0266-afde-43a7-b77c-4b29a2d55c3a"). InnerVolumeSpecName "kube-api-access-6xml5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.391638 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt" (OuterVolumeSpecName: "kube-api-access-hgjrt") pod "8a218ca3-a163-4f9c-8e73-f630b2228bb2" (UID: "8a218ca3-a163-4f9c-8e73-f630b2228bb2"). InnerVolumeSpecName "kube-api-access-hgjrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.414452 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a218ca3-a163-4f9c-8e73-f630b2228bb2" (UID: "8a218ca3-a163-4f9c-8e73-f630b2228bb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.415838 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data" (OuterVolumeSpecName: "config-data") pod "8a218ca3-a163-4f9c-8e73-f630b2228bb2" (UID: "8a218ca3-a163-4f9c-8e73-f630b2228bb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.418572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "325e0266-afde-43a7-b77c-4b29a2d55c3a" (UID: "325e0266-afde-43a7-b77c-4b29a2d55c3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.430559 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data" (OuterVolumeSpecName: "config-data") pod "325e0266-afde-43a7-b77c-4b29a2d55c3a" (UID: "325e0266-afde-43a7-b77c-4b29a2d55c3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486207 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486261 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/325e0266-afde-43a7-b77c-4b29a2d55c3a-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486278 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a218ca3-a163-4f9c-8e73-f630b2228bb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486297 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xml5\" (UniqueName: \"kubernetes.io/projected/325e0266-afde-43a7-b77c-4b29a2d55c3a-kube-api-access-6xml5\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486309 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486329 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/325e0266-afde-43a7-b77c-4b29a2d55c3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:19 crc kubenswrapper[4772]: I0127 15:30:19.486340 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgjrt\" (UniqueName: \"kubernetes.io/projected/8a218ca3-a163-4f9c-8e73-f630b2228bb2-kube-api-access-hgjrt\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.212081 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"325e0266-afde-43a7-b77c-4b29a2d55c3a","Type":"ContainerDied","Data":"0fbb6b12d9611d132f4209f04274b5892a26d8ef07f59ee5c52d51242f78a833"} Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.212525 4772 scope.go:117] "RemoveContainer" containerID="f65fe58604f2e802f9b737bcbd1b97003be50f6c492456675251d85de107de5f" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.212109 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.212100 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.239897 4772 scope.go:117] "RemoveContainer" containerID="dea9ce36986750defb2b297fb876194a33a2f0aaee4b8e21995a37e01e57d98a" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.256701 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.283765 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.298598 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.312157 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: E0127 15:30:20.312695 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.312717 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:30:20 crc kubenswrapper[4772]: E0127 15:30:20.312739 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d103a19-1490-433a-abdb-3ebd279265f5" containerName="collect-profiles" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.312748 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d103a19-1490-433a-abdb-3ebd279265f5" containerName="collect-profiles" Jan 27 15:30:20 crc kubenswrapper[4772]: E0127 15:30:20.312773 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-metadata" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.312783 4772 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-metadata" Jan 27 15:30:20 crc kubenswrapper[4772]: E0127 15:30:20.312812 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-log" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.312819 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-log" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.313025 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d103a19-1490-433a-abdb-3ebd279265f5" containerName="collect-profiles" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.313053 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-log" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.313080 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" containerName="nova-metadata-metadata" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.313093 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.315497 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.317465 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.323405 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.324367 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.334772 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.346283 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.347823 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.350585 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.352436 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.352645 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.365292 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404128 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404339 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404609 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ttx\" (UniqueName: \"kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404857 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6whx\" (UniqueName: \"kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404889 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.404925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.405001 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.506746 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ttx\" (UniqueName: 
\"kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.506830 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.506880 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.506912 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6whx\" (UniqueName: \"kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507436 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507792 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " 
pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507824 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507860 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507901 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507915 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.507936 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.512514 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.514860 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.515125 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.515305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.515704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.521545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.527759 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.528515 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ttx\" (UniqueName: \"kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx\") pod \"nova-cell1-novncproxy-0\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.529721 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6whx\" (UniqueName: \"kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx\") pod \"nova-metadata-0\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.639068 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.674451 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.676708 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325e0266-afde-43a7-b77c-4b29a2d55c3a" path="/var/lib/kubelet/pods/325e0266-afde-43a7-b77c-4b29a2d55c3a/volumes" Jan 27 15:30:20 crc kubenswrapper[4772]: I0127 15:30:20.677852 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a218ca3-a163-4f9c-8e73-f630b2228bb2" path="/var/lib/kubelet/pods/8a218ca3-a163-4f9c-8e73-f630b2228bb2/volumes" Jan 27 15:30:21 crc kubenswrapper[4772]: I0127 15:30:21.151877 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:21 crc kubenswrapper[4772]: I0127 15:30:21.248473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerStarted","Data":"bc2f5b8265c22782479449303b876d41140ba5bbf29e25aebf07297950204c86"} Jan 27 15:30:21 crc kubenswrapper[4772]: I0127 15:30:21.272458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:30:21 crc kubenswrapper[4772]: W0127 15:30:21.273480 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e69643a_e8c2_4057_a993_d5506ceeec1b.slice/crio-6444b6c25763e568fb1ce306052e7e5dc898559abb4fc82fb393de7a9f4a2b66 WatchSource:0}: Error finding container 6444b6c25763e568fb1ce306052e7e5dc898559abb4fc82fb393de7a9f4a2b66: Status 404 returned error can't find the container with id 6444b6c25763e568fb1ce306052e7e5dc898559abb4fc82fb393de7a9f4a2b66 Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.264905 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerStarted","Data":"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2"} Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.265531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerStarted","Data":"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba"} Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.267546 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e69643a-e8c2-4057-a993-d5506ceeec1b","Type":"ContainerStarted","Data":"7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce"} Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.267586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e69643a-e8c2-4057-a993-d5506ceeec1b","Type":"ContainerStarted","Data":"6444b6c25763e568fb1ce306052e7e5dc898559abb4fc82fb393de7a9f4a2b66"} Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.314033 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.314010716 podStartE2EDuration="2.314010716s" podCreationTimestamp="2026-01-27 15:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:22.2854577 +0000 UTC m=+1408.266066818" watchObservedRunningTime="2026-01-27 15:30:22.314010716 +0000 UTC m=+1408.294619814" Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.314379 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.314371656 podStartE2EDuration="2.314371656s" podCreationTimestamp="2026-01-27 15:30:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:22.308452085 +0000 UTC m=+1408.289061183" watchObservedRunningTime="2026-01-27 15:30:22.314371656 +0000 UTC m=+1408.294980764" Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.515491 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.516109 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.518094 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 15:30:22 crc kubenswrapper[4772]: I0127 15:30:22.518692 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.277449 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.282454 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.475138 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.476753 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.502241 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.579500 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.579571 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.579609 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqp8\" (UniqueName: \"kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.579650 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.579872 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.580152 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682793 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682835 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682881 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqp8\" (UniqueName: \"kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.682932 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.684049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.684706 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.685370 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.686007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.686956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.716209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqp8\" (UniqueName: \"kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8\") pod \"dnsmasq-dns-59cf4bdb65-2hd4f\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:23 crc kubenswrapper[4772]: I0127 15:30:23.804429 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:24 crc kubenswrapper[4772]: W0127 15:30:24.277762 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfddd5e59_3124_4a05_aafd_92d6aea05f7e.slice/crio-c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1 WatchSource:0}: Error finding container c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1: Status 404 returned error can't find the container with id c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1 Jan 27 15:30:24 crc kubenswrapper[4772]: I0127 15:30:24.279835 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.304134 4772 generic.go:334] "Generic (PLEG): container finished" podID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerID="71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500" exitCode=0 Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.304217 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" event={"ID":"fddd5e59-3124-4a05-aafd-92d6aea05f7e","Type":"ContainerDied","Data":"71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500"} Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.304711 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" event={"ID":"fddd5e59-3124-4a05-aafd-92d6aea05f7e","Type":"ContainerStarted","Data":"c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1"} Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.589340 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.590276 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-central-agent" containerID="cri-o://7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed" gracePeriod=30 Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.590642 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="sg-core" containerID="cri-o://a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8" gracePeriod=30 Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.590675 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-notification-agent" containerID="cri-o://20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504" gracePeriod=30 Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.590682 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="proxy-httpd" containerID="cri-o://a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481" gracePeriod=30 Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.639666 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.641097 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:30:25 crc kubenswrapper[4772]: I0127 15:30:25.675157 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.314264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" 
event={"ID":"fddd5e59-3124-4a05-aafd-92d6aea05f7e","Type":"ContainerStarted","Data":"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86"} Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.315606 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319693 4772 generic.go:334] "Generic (PLEG): container finished" podID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerID="a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481" exitCode=0 Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319722 4772 generic.go:334] "Generic (PLEG): container finished" podID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerID="a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8" exitCode=2 Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319732 4772 generic.go:334] "Generic (PLEG): container finished" podID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerID="7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed" exitCode=0 Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319869 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerDied","Data":"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481"} Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerDied","Data":"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8"} Jan 27 15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.319914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerDied","Data":"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed"} Jan 27 
15:30:26 crc kubenswrapper[4772]: I0127 15:30:26.339814 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" podStartSLOduration=3.339794555 podStartE2EDuration="3.339794555s" podCreationTimestamp="2026-01-27 15:30:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:26.337914971 +0000 UTC m=+1412.318524089" watchObservedRunningTime="2026-01-27 15:30:26.339794555 +0000 UTC m=+1412.320403653" Jan 27 15:30:27 crc kubenswrapper[4772]: I0127 15:30:27.405596 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:27 crc kubenswrapper[4772]: I0127 15:30:27.406210 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-log" containerID="cri-o://b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773" gracePeriod=30 Jan 27 15:30:27 crc kubenswrapper[4772]: I0127 15:30:27.406312 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-api" containerID="cri-o://ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f" gracePeriod=30 Jan 27 15:30:28 crc kubenswrapper[4772]: I0127 15:30:28.343598 4772 generic.go:334] "Generic (PLEG): container finished" podID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerID="b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773" exitCode=143 Jan 27 15:30:28 crc kubenswrapper[4772]: I0127 15:30:28.344282 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerDied","Data":"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773"} Jan 27 15:30:30 crc kubenswrapper[4772]: I0127 
15:30:30.639634 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 15:30:30 crc kubenswrapper[4772]: I0127 15:30:30.640028 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 15:30:30 crc kubenswrapper[4772]: I0127 15:30:30.675022 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:30 crc kubenswrapper[4772]: I0127 15:30:30.694136 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.090490 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.245009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data\") pod \"dc238129-30ce-43ab-be89-045e2a9ae8e4\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.245072 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhrvf\" (UniqueName: \"kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf\") pod \"dc238129-30ce-43ab-be89-045e2a9ae8e4\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.245183 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs\") pod \"dc238129-30ce-43ab-be89-045e2a9ae8e4\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.245241 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle\") pod \"dc238129-30ce-43ab-be89-045e2a9ae8e4\" (UID: \"dc238129-30ce-43ab-be89-045e2a9ae8e4\") " Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.246916 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs" (OuterVolumeSpecName: "logs") pod "dc238129-30ce-43ab-be89-045e2a9ae8e4" (UID: "dc238129-30ce-43ab-be89-045e2a9ae8e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.270319 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf" (OuterVolumeSpecName: "kube-api-access-lhrvf") pod "dc238129-30ce-43ab-be89-045e2a9ae8e4" (UID: "dc238129-30ce-43ab-be89-045e2a9ae8e4"). InnerVolumeSpecName "kube-api-access-lhrvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.283882 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc238129-30ce-43ab-be89-045e2a9ae8e4" (UID: "dc238129-30ce-43ab-be89-045e2a9ae8e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.306502 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data" (OuterVolumeSpecName: "config-data") pod "dc238129-30ce-43ab-be89-045e2a9ae8e4" (UID: "dc238129-30ce-43ab-be89-045e2a9ae8e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.347729 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.348165 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhrvf\" (UniqueName: \"kubernetes.io/projected/dc238129-30ce-43ab-be89-045e2a9ae8e4-kube-api-access-lhrvf\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.348203 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc238129-30ce-43ab-be89-045e2a9ae8e4-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.348216 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc238129-30ce-43ab-be89-045e2a9ae8e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.374915 4772 generic.go:334] "Generic (PLEG): container finished" podID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerID="ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f" exitCode=0 Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.375000 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.375040 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerDied","Data":"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f"} Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.375070 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc238129-30ce-43ab-be89-045e2a9ae8e4","Type":"ContainerDied","Data":"73f31dc4802f0c84c1162163b7afaf3ae7604a699c756292b15d9d1058c35cf9"} Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.375087 4772 scope.go:117] "RemoveContainer" containerID="ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.396149 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.403324 4772 scope.go:117] "RemoveContainer" containerID="b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.435965 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.457268 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.487310 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:31 crc kubenswrapper[4772]: E0127 15:30:31.489895 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-api" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.489928 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" 
containerName="nova-api-api" Jan 27 15:30:31 crc kubenswrapper[4772]: E0127 15:30:31.489949 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-log" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.489955 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-log" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.490498 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-api" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.490513 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" containerName="nova-api-log" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.493737 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.497280 4772 scope.go:117] "RemoveContainer" containerID="ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.497550 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.497577 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.497603 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 15:30:31 crc kubenswrapper[4772]: E0127 15:30:31.499681 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f\": container with ID starting with 
ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f not found: ID does not exist" containerID="ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.499755 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f"} err="failed to get container status \"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f\": rpc error: code = NotFound desc = could not find container \"ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f\": container with ID starting with ce0174bb725abb785d0b27fe956a5f1ba8818f4c6e84ec3b0dd1b5155087cf7f not found: ID does not exist" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.499785 4772 scope.go:117] "RemoveContainer" containerID="b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773" Jan 27 15:30:31 crc kubenswrapper[4772]: E0127 15:30:31.501516 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773\": container with ID starting with b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773 not found: ID does not exist" containerID="b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.511929 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773"} err="failed to get container status \"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773\": rpc error: code = NotFound desc = could not find container \"b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773\": container with ID starting with b83e870244cf87f2c8284ca6688ebf8db7da16dcd1b693d6c904bb5747128773 not found: ID does not 
exist" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.528683 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.587244 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dr2w8"] Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.588947 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.592887 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.594797 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.599966 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dr2w8"] Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.652406 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.652431 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.658950 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjcn\" (UniqueName: 
\"kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659106 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659191 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwgqb\" (UniqueName: \"kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659278 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data\") 
pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659367 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.659440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.760937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 
27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.760992 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.761019 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwgqb\" (UniqueName: \"kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.761475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762398 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762459 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762539 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.762570 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xjcn\" (UniqueName: \"kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.767110 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.767323 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.767820 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.769008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.769381 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.772853 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.776967 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.782115 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjcn\" (UniqueName: \"kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn\") pod \"nova-cell1-cell-mapping-dr2w8\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.794700 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwgqb\" (UniqueName: \"kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb\") pod \"nova-api-0\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.829515 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:31 crc kubenswrapper[4772]: I0127 15:30:31.909778 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:32 crc kubenswrapper[4772]: I0127 15:30:32.369864 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:32 crc kubenswrapper[4772]: I0127 15:30:32.390377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerStarted","Data":"f7d96b6a64da6759e6a7a8bc3df6aebfecfa7e5ec0b668d726902cda154a1888"} Jan 27 15:30:32 crc kubenswrapper[4772]: W0127 15:30:32.457470 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d10501c_aefc_4b6b_934a_cd53db7aa029.slice/crio-33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd WatchSource:0}: Error finding container 33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd: Status 404 returned error can't find the container with id 33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd Jan 27 15:30:32 crc kubenswrapper[4772]: I0127 15:30:32.461899 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dr2w8"] Jan 27 15:30:32 crc kubenswrapper[4772]: I0127 15:30:32.674004 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc238129-30ce-43ab-be89-045e2a9ae8e4" path="/var/lib/kubelet/pods/dc238129-30ce-43ab-be89-045e2a9ae8e4/volumes" Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.401628 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dr2w8" event={"ID":"2d10501c-aefc-4b6b-934a-cd53db7aa029","Type":"ContainerStarted","Data":"7c4a200cbf0e299c55e6a175696503b17a34325960db8f2addd09db07bdebe00"} Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.401989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dr2w8" 
event={"ID":"2d10501c-aefc-4b6b-934a-cd53db7aa029","Type":"ContainerStarted","Data":"33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd"} Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.406267 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerStarted","Data":"9356a4887a357fc1448800688f7032704f830e8c9f7df6f0b3fe3c97ddfa3bb7"} Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.424618 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dr2w8" podStartSLOduration=2.424594896 podStartE2EDuration="2.424594896s" podCreationTimestamp="2026-01-27 15:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:33.419064116 +0000 UTC m=+1419.399673214" watchObservedRunningTime="2026-01-27 15:30:33.424594896 +0000 UTC m=+1419.405203994" Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.806000 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.882847 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:30:33 crc kubenswrapper[4772]: I0127 15:30:33.884259 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="dnsmasq-dns" containerID="cri-o://e6ff45d04539aabc690d9ba73108fbdca7b2759433b3c528ba073567c047da4f" gracePeriod=10 Jan 27 15:30:34 crc kubenswrapper[4772]: I0127 15:30:34.418151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerStarted","Data":"596e46cd823d75132da321dc3e1e49f4c351f7ee177713e0655112d945b2390c"} Jan 27 15:30:34 crc kubenswrapper[4772]: I0127 15:30:34.420934 4772 generic.go:334] "Generic (PLEG): container finished" podID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerID="e6ff45d04539aabc690d9ba73108fbdca7b2759433b3c528ba073567c047da4f" exitCode=0 Jan 27 15:30:34 crc kubenswrapper[4772]: I0127 15:30:34.421146 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" event={"ID":"73ee81ee-57fa-466a-8ada-2fa4da5987a0","Type":"ContainerDied","Data":"e6ff45d04539aabc690d9ba73108fbdca7b2759433b3c528ba073567c047da4f"} Jan 27 15:30:34 crc kubenswrapper[4772]: I0127 15:30:34.440662 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.4406461 podStartE2EDuration="3.4406461s" podCreationTimestamp="2026-01-27 15:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:34.435905663 +0000 UTC m=+1420.416514761" watchObservedRunningTime="2026-01-27 15:30:34.4406461 +0000 UTC m=+1420.421255198" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.005984 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.043792 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.043971 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf2fc\" (UniqueName: \"kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.044038 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.044105 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.044133 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.044182 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb\") pod \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\" (UID: \"73ee81ee-57fa-466a-8ada-2fa4da5987a0\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.050827 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc" (OuterVolumeSpecName: "kube-api-access-vf2fc") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "kube-api-access-vf2fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.104618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.112789 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.114299 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config" (OuterVolumeSpecName: "config") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.118953 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.147511 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.147547 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.147556 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.147567 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.147575 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf2fc\" (UniqueName: \"kubernetes.io/projected/73ee81ee-57fa-466a-8ada-2fa4da5987a0-kube-api-access-vf2fc\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.151413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73ee81ee-57fa-466a-8ada-2fa4da5987a0" (UID: "73ee81ee-57fa-466a-8ada-2fa4da5987a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.235299 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248193 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248304 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br29b\" (UniqueName: \"kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248346 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248369 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248399 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248483 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248516 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data\") pod \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\" (UID: \"445d3e38-8f68-4dad-9e97-d927d60ee1e4\") " Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.248869 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73ee81ee-57fa-466a-8ada-2fa4da5987a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.249880 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.249990 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.256352 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b" (OuterVolumeSpecName: "kube-api-access-br29b") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "kube-api-access-br29b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.257333 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts" (OuterVolumeSpecName: "scripts") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.297571 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.343341 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354116 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354152 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354233 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/445d3e38-8f68-4dad-9e97-d927d60ee1e4-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354245 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br29b\" (UniqueName: \"kubernetes.io/projected/445d3e38-8f68-4dad-9e97-d927d60ee1e4-kube-api-access-br29b\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354255 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.354264 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.366567 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.397811 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data" (OuterVolumeSpecName: "config-data") pod "445d3e38-8f68-4dad-9e97-d927d60ee1e4" (UID: "445d3e38-8f68-4dad-9e97-d927d60ee1e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.431361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" event={"ID":"73ee81ee-57fa-466a-8ada-2fa4da5987a0","Type":"ContainerDied","Data":"79cc249e145e1f853047b2402d998a0aa111ca080edd461a766047221e313d63"} Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.431378 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-88t2p" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.431425 4772 scope.go:117] "RemoveContainer" containerID="e6ff45d04539aabc690d9ba73108fbdca7b2759433b3c528ba073567c047da4f" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.444849 4772 generic.go:334] "Generic (PLEG): container finished" podID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerID="20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504" exitCode=0 Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.444909 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.444956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerDied","Data":"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504"} Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.444989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"445d3e38-8f68-4dad-9e97-d927d60ee1e4","Type":"ContainerDied","Data":"5f55b8adcf12a6b66983b6beb58f8b720085a1c824b9e5e763f7ea5a2b511df2"} Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.463555 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.463588 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/445d3e38-8f68-4dad-9e97-d927d60ee1e4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.479549 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:30:35 crc 
kubenswrapper[4772]: I0127 15:30:35.488142 4772 scope.go:117] "RemoveContainer" containerID="4148bc33036bbf1369f8777e53357b14d6ea084f15c60f06da4d4195151a1ddd" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.492971 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-88t2p"] Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.516139 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.538767 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.558575 4772 scope.go:117] "RemoveContainer" containerID="a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.564637 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565122 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="init" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565145 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="init" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565159 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="sg-core" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565184 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="sg-core" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565209 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-central-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565217 4772 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-central-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565237 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="dnsmasq-dns" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565246 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="dnsmasq-dns" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565259 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="proxy-httpd" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565266 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="proxy-httpd" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.565281 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-notification-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565289 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-notification-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565539 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" containerName="dnsmasq-dns" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565562 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-central-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565573 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="ceilometer-notification-agent" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 
15:30:35.565582 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="sg-core" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.565602 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" containerName="proxy-httpd" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.568819 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.570642 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.574070 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.574337 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.579370 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.592100 4772 scope.go:117] "RemoveContainer" containerID="a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.675515 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.675923 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jk7\" (UniqueName: 
\"kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676015 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676052 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676250 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676280 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676344 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts\") pod \"ceilometer-0\" (UID: 
\"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.676534 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.689347 4772 scope.go:117] "RemoveContainer" containerID="20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.754241 4772 scope.go:117] "RemoveContainer" containerID="7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.772787 4772 scope.go:117] "RemoveContainer" containerID="a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.773205 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481\": container with ID starting with a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481 not found: ID does not exist" containerID="a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.773248 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481"} err="failed to get container status \"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481\": rpc error: code = NotFound desc = could not find container \"a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481\": container with ID starting with 
a2f054d9dabda4ad71c0c80285f0ce4245d4a9aa17475b812ef0cd6382dca481 not found: ID does not exist" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.773268 4772 scope.go:117] "RemoveContainer" containerID="a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.773613 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8\": container with ID starting with a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8 not found: ID does not exist" containerID="a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.773636 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8"} err="failed to get container status \"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8\": rpc error: code = NotFound desc = could not find container \"a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8\": container with ID starting with a36cda53da6b26fa4dc152f0abf33d17f95ff93ab0b84474109c123bbd0176a8 not found: ID does not exist" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.773652 4772 scope.go:117] "RemoveContainer" containerID="20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.773868 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504\": container with ID starting with 20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504 not found: ID does not exist" containerID="20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504" Jan 27 15:30:35 crc 
kubenswrapper[4772]: I0127 15:30:35.773892 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504"} err="failed to get container status \"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504\": rpc error: code = NotFound desc = could not find container \"20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504\": container with ID starting with 20892681e030eeb363de84f7efb9def934d634922b8bfc86ec12ed76e131d504 not found: ID does not exist" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.773907 4772 scope.go:117] "RemoveContainer" containerID="7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed" Jan 27 15:30:35 crc kubenswrapper[4772]: E0127 15:30:35.774249 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed\": container with ID starting with 7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed not found: ID does not exist" containerID="7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.774388 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed"} err="failed to get container status \"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed\": rpc error: code = NotFound desc = could not find container \"7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed\": container with ID starting with 7d804eabf1929c1b2bbdc21b8685f4ea2301f718b175dd856db56adb377cd8ed not found: ID does not exist" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778437 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jk7\" (UniqueName: \"kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778555 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778590 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778642 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc 
kubenswrapper[4772]: I0127 15:30:35.778659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.778692 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.780505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.780720 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.782981 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.783719 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts\") pod \"ceilometer-0\" (UID: 
\"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.783951 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.785343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.785600 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.796639 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jk7\" (UniqueName: \"kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7\") pod \"ceilometer-0\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " pod="openstack/ceilometer-0" Jan 27 15:30:35 crc kubenswrapper[4772]: I0127 15:30:35.888529 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:30:36 crc kubenswrapper[4772]: I0127 15:30:36.369095 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:30:36 crc kubenswrapper[4772]: W0127 15:30:36.375537 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaea5ee43_36e3_437d_8aca_b2faedd87c5b.slice/crio-0d9e1d64ee2212bcbce9b483a76517d64478f416567bc79c87cd9fc874d3b4e1 WatchSource:0}: Error finding container 0d9e1d64ee2212bcbce9b483a76517d64478f416567bc79c87cd9fc874d3b4e1: Status 404 returned error can't find the container with id 0d9e1d64ee2212bcbce9b483a76517d64478f416567bc79c87cd9fc874d3b4e1 Jan 27 15:30:36 crc kubenswrapper[4772]: I0127 15:30:36.458913 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerStarted","Data":"0d9e1d64ee2212bcbce9b483a76517d64478f416567bc79c87cd9fc874d3b4e1"} Jan 27 15:30:36 crc kubenswrapper[4772]: I0127 15:30:36.677028 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="445d3e38-8f68-4dad-9e97-d927d60ee1e4" path="/var/lib/kubelet/pods/445d3e38-8f68-4dad-9e97-d927d60ee1e4/volumes" Jan 27 15:30:36 crc kubenswrapper[4772]: I0127 15:30:36.678183 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ee81ee-57fa-466a-8ada-2fa4da5987a0" path="/var/lib/kubelet/pods/73ee81ee-57fa-466a-8ada-2fa4da5987a0/volumes" Jan 27 15:30:38 crc kubenswrapper[4772]: I0127 15:30:38.481407 4772 generic.go:334] "Generic (PLEG): container finished" podID="2d10501c-aefc-4b6b-934a-cd53db7aa029" containerID="7c4a200cbf0e299c55e6a175696503b17a34325960db8f2addd09db07bdebe00" exitCode=0 Jan 27 15:30:38 crc kubenswrapper[4772]: I0127 15:30:38.481511 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dr2w8" 
event={"ID":"2d10501c-aefc-4b6b-934a-cd53db7aa029","Type":"ContainerDied","Data":"7c4a200cbf0e299c55e6a175696503b17a34325960db8f2addd09db07bdebe00"} Jan 27 15:30:38 crc kubenswrapper[4772]: I0127 15:30:38.485103 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerStarted","Data":"a4293d3cbd138216987430f5dab62fa26e55c56743eee0b42dd4fc7797a52afd"} Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.506514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerStarted","Data":"81bb10c06283521cef14702be02bc4e89a7f82e4ae6c7d56b76d0d05f92797d0"} Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.906622 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.953877 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle\") pod \"2d10501c-aefc-4b6b-934a-cd53db7aa029\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.954514 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data\") pod \"2d10501c-aefc-4b6b-934a-cd53db7aa029\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.954572 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts\") pod \"2d10501c-aefc-4b6b-934a-cd53db7aa029\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " Jan 27 15:30:39 crc 
kubenswrapper[4772]: I0127 15:30:39.954714 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjcn\" (UniqueName: \"kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn\") pod \"2d10501c-aefc-4b6b-934a-cd53db7aa029\" (UID: \"2d10501c-aefc-4b6b-934a-cd53db7aa029\") " Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.970531 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts" (OuterVolumeSpecName: "scripts") pod "2d10501c-aefc-4b6b-934a-cd53db7aa029" (UID: "2d10501c-aefc-4b6b-934a-cd53db7aa029"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.987205 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn" (OuterVolumeSpecName: "kube-api-access-4xjcn") pod "2d10501c-aefc-4b6b-934a-cd53db7aa029" (UID: "2d10501c-aefc-4b6b-934a-cd53db7aa029"). InnerVolumeSpecName "kube-api-access-4xjcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.988870 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data" (OuterVolumeSpecName: "config-data") pod "2d10501c-aefc-4b6b-934a-cd53db7aa029" (UID: "2d10501c-aefc-4b6b-934a-cd53db7aa029"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:39 crc kubenswrapper[4772]: I0127 15:30:39.990861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d10501c-aefc-4b6b-934a-cd53db7aa029" (UID: "2d10501c-aefc-4b6b-934a-cd53db7aa029"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.056687 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.056753 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.056773 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d10501c-aefc-4b6b-934a-cd53db7aa029-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.056791 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjcn\" (UniqueName: \"kubernetes.io/projected/2d10501c-aefc-4b6b-934a-cd53db7aa029-kube-api-access-4xjcn\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.520423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dr2w8" event={"ID":"2d10501c-aefc-4b6b-934a-cd53db7aa029","Type":"ContainerDied","Data":"33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd"} Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.520469 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dr2w8" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.520475 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33208e54b5b97ccefb163c65dd3c0ecf279de8f0faa126d781f9e471398f64fd" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.651949 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.661774 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.673468 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.692469 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.692754 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-log" containerID="cri-o://9356a4887a357fc1448800688f7032704f830e8c9f7df6f0b3fe3c97ddfa3bb7" gracePeriod=30 Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.692821 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-api" containerID="cri-o://596e46cd823d75132da321dc3e1e49f4c351f7ee177713e0655112d945b2390c" gracePeriod=30 Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.717591 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.717885 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" 
containerName="nova-scheduler-scheduler" containerID="cri-o://caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" gracePeriod=30 Jan 27 15:30:40 crc kubenswrapper[4772]: I0127 15:30:40.741080 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:41 crc kubenswrapper[4772]: I0127 15:30:41.530703 4772 generic.go:334] "Generic (PLEG): container finished" podID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerID="596e46cd823d75132da321dc3e1e49f4c351f7ee177713e0655112d945b2390c" exitCode=0 Jan 27 15:30:41 crc kubenswrapper[4772]: I0127 15:30:41.531011 4772 generic.go:334] "Generic (PLEG): container finished" podID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerID="9356a4887a357fc1448800688f7032704f830e8c9f7df6f0b3fe3c97ddfa3bb7" exitCode=143 Jan 27 15:30:41 crc kubenswrapper[4772]: I0127 15:30:41.530795 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerDied","Data":"596e46cd823d75132da321dc3e1e49f4c351f7ee177713e0655112d945b2390c"} Jan 27 15:30:41 crc kubenswrapper[4772]: I0127 15:30:41.531062 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerDied","Data":"9356a4887a357fc1448800688f7032704f830e8c9f7df6f0b3fe3c97ddfa3bb7"} Jan 27 15:30:41 crc kubenswrapper[4772]: I0127 15:30:41.588683 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.013226 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103489 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103634 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwgqb\" (UniqueName: \"kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103691 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103752 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.103792 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs\") pod \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\" (UID: \"838e8a63-fd3f-4f03-9030-a2c3a4db7393\") " Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.104736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs" (OuterVolumeSpecName: "logs") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.117418 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb" (OuterVolumeSpecName: "kube-api-access-kwgqb") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "kube-api-access-kwgqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.147813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data" (OuterVolumeSpecName: "config-data") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.147932 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.181966 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.205712 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.205740 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/838e8a63-fd3f-4f03-9030-a2c3a4db7393-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.205750 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.205759 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwgqb\" (UniqueName: \"kubernetes.io/projected/838e8a63-fd3f-4f03-9030-a2c3a4db7393-kube-api-access-kwgqb\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.205771 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.239918 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "838e8a63-fd3f-4f03-9030-a2c3a4db7393" (UID: "838e8a63-fd3f-4f03-9030-a2c3a4db7393"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.307482 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/838e8a63-fd3f-4f03-9030-a2c3a4db7393-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.544773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerStarted","Data":"7b0085db2ce3021657d7773e88196b66b6759beeca3bff2b51fc3fdf5d6b4bd2"} Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.546133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"838e8a63-fd3f-4f03-9030-a2c3a4db7393","Type":"ContainerDied","Data":"f7d96b6a64da6759e6a7a8bc3df6aebfecfa7e5ec0b668d726902cda154a1888"} Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.546253 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.546283 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" containerID="cri-o://2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba" gracePeriod=30 Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.546339 4772 scope.go:117] "RemoveContainer" containerID="596e46cd823d75132da321dc3e1e49f4c351f7ee177713e0655112d945b2390c" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.546390 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" containerID="cri-o://5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2" gracePeriod=30 Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.587437 4772 scope.go:117] "RemoveContainer" containerID="9356a4887a357fc1448800688f7032704f830e8c9f7df6f0b3fe3c97ddfa3bb7" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.638880 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.675919 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.679292 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:42 crc kubenswrapper[4772]: E0127 15:30:42.679793 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-log" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.679816 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-log" Jan 27 15:30:42 crc kubenswrapper[4772]: E0127 
15:30:42.679831 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d10501c-aefc-4b6b-934a-cd53db7aa029" containerName="nova-manage" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.679838 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d10501c-aefc-4b6b-934a-cd53db7aa029" containerName="nova-manage" Jan 27 15:30:42 crc kubenswrapper[4772]: E0127 15:30:42.679891 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-api" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.679905 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-api" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.680132 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d10501c-aefc-4b6b-934a-cd53db7aa029" containerName="nova-manage" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.680154 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-api" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.680183 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" containerName="nova-api-log" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.681229 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.683442 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.683788 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.684300 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.696046 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716344 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716375 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716478 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716511 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmnhr\" (UniqueName: \"kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.716536 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817707 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmnhr\" (UniqueName: \"kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817781 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " 
pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817879 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.817941 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.818427 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.821729 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.822917 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.822970 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.824257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.834383 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmnhr\" (UniqueName: \"kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr\") pod \"nova-api-0\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " pod="openstack/nova-api-0" Jan 27 15:30:42 crc kubenswrapper[4772]: I0127 15:30:42.999157 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:30:43 crc kubenswrapper[4772]: E0127 15:30:43.315219 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:30:43 crc kubenswrapper[4772]: E0127 15:30:43.318087 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:30:43 crc kubenswrapper[4772]: E0127 15:30:43.319589 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:30:43 crc kubenswrapper[4772]: E0127 15:30:43.319679 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" containerName="nova-scheduler-scheduler" Jan 27 15:30:43 crc kubenswrapper[4772]: I0127 15:30:43.475981 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:30:43 crc kubenswrapper[4772]: I0127 15:30:43.559336 4772 generic.go:334] "Generic (PLEG): container finished" podID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerID="2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba" 
exitCode=143 Jan 27 15:30:43 crc kubenswrapper[4772]: I0127 15:30:43.559425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerDied","Data":"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba"} Jan 27 15:30:43 crc kubenswrapper[4772]: I0127 15:30:43.561432 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerStarted","Data":"dfddffa6f559c177ea99d7f7fef5a8fb81a5dc7c7f2005faaf77278166e23279"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.382395 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.466996 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt2mz\" (UniqueName: \"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz\") pod \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.467099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data\") pod \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.467205 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle\") pod \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\" (UID: \"c0a8fd83-de04-4458-8ef8-ebe7ae60194f\") " Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.475020 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz" (OuterVolumeSpecName: "kube-api-access-rt2mz") pod "c0a8fd83-de04-4458-8ef8-ebe7ae60194f" (UID: "c0a8fd83-de04-4458-8ef8-ebe7ae60194f"). InnerVolumeSpecName "kube-api-access-rt2mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.501340 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data" (OuterVolumeSpecName: "config-data") pod "c0a8fd83-de04-4458-8ef8-ebe7ae60194f" (UID: "c0a8fd83-de04-4458-8ef8-ebe7ae60194f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.507359 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0a8fd83-de04-4458-8ef8-ebe7ae60194f" (UID: "c0a8fd83-de04-4458-8ef8-ebe7ae60194f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.569777 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.569809 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt2mz\" (UniqueName: \"kubernetes.io/projected/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-kube-api-access-rt2mz\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.569822 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a8fd83-de04-4458-8ef8-ebe7ae60194f-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.580385 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerStarted","Data":"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.580438 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerStarted","Data":"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.584653 4772 generic.go:334] "Generic (PLEG): container finished" podID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" exitCode=0 Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.584729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"c0a8fd83-de04-4458-8ef8-ebe7ae60194f","Type":"ContainerDied","Data":"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.584829 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0a8fd83-de04-4458-8ef8-ebe7ae60194f","Type":"ContainerDied","Data":"4d588204c22b2f28d3f4e69bf8dac95975b2db88030c7b5662a17b491819b8ab"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.584866 4772 scope.go:117] "RemoveContainer" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.584775 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.592624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerStarted","Data":"0447c2ea1d147e4cee27fce146e4edc38d746774dc492452f5da3c48df7973bb"} Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.593661 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.618484 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.618460994 podStartE2EDuration="2.618460994s" podCreationTimestamp="2026-01-27 15:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:44.60480723 +0000 UTC m=+1430.585416328" watchObservedRunningTime="2026-01-27 15:30:44.618460994 +0000 UTC m=+1430.599070102" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.622936 4772 scope.go:117] "RemoveContainer" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" Jan 
27 15:30:44 crc kubenswrapper[4772]: E0127 15:30:44.625756 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e\": container with ID starting with caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e not found: ID does not exist" containerID="caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.625805 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e"} err="failed to get container status \"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e\": rpc error: code = NotFound desc = could not find container \"caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e\": container with ID starting with caa4ecd4bd8e1eda4d938d7b43a4c7d54c06fe65669c022620e5448e98a4584e not found: ID does not exist" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.643498 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.226216639 podStartE2EDuration="9.643468717s" podCreationTimestamp="2026-01-27 15:30:35 +0000 UTC" firstStartedPulling="2026-01-27 15:30:36.378154601 +0000 UTC m=+1422.358763709" lastFinishedPulling="2026-01-27 15:30:43.795406689 +0000 UTC m=+1429.776015787" observedRunningTime="2026-01-27 15:30:44.628873715 +0000 UTC m=+1430.609482813" watchObservedRunningTime="2026-01-27 15:30:44.643468717 +0000 UTC m=+1430.624077915" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.679595 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838e8a63-fd3f-4f03-9030-a2c3a4db7393" path="/var/lib/kubelet/pods/838e8a63-fd3f-4f03-9030-a2c3a4db7393/volumes" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.680208 4772 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.687626 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.710833 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:44 crc kubenswrapper[4772]: E0127 15:30:44.711294 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" containerName="nova-scheduler-scheduler" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.711313 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" containerName="nova-scheduler-scheduler" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.711505 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" containerName="nova-scheduler-scheduler" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.712105 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.713949 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.719738 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.773136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.773349 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.773449 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq4lp\" (UniqueName: \"kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.875224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.875296 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hq4lp\" (UniqueName: \"kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.875416 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.879426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.879658 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:44 crc kubenswrapper[4772]: I0127 15:30:44.906776 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq4lp\" (UniqueName: \"kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp\") pod \"nova-scheduler-0\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " pod="openstack/nova-scheduler-0" Jan 27 15:30:45 crc kubenswrapper[4772]: I0127 15:30:45.038811 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:30:45 crc kubenswrapper[4772]: I0127 15:30:45.516199 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:30:45 crc kubenswrapper[4772]: W0127 15:30:45.520338 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83f7578_8113_46c8_be24_5968aa0ca563.slice/crio-ee6d65efb4f3df3d96335b7b6b58d4ee20a12c71c0ca644b8c8c4208300d2710 WatchSource:0}: Error finding container ee6d65efb4f3df3d96335b7b6b58d4ee20a12c71c0ca644b8c8c4208300d2710: Status 404 returned error can't find the container with id ee6d65efb4f3df3d96335b7b6b58d4ee20a12c71c0ca644b8c8c4208300d2710 Jan 27 15:30:45 crc kubenswrapper[4772]: I0127 15:30:45.617581 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b83f7578-8113-46c8-be24-5968aa0ca563","Type":"ContainerStarted","Data":"ee6d65efb4f3df3d96335b7b6b58d4ee20a12c71c0ca644b8c8c4208300d2710"} Jan 27 15:30:45 crc kubenswrapper[4772]: I0127 15:30:45.702973 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:37576->10.217.0.199:8775: read: connection reset by peer" Jan 27 15:30:45 crc kubenswrapper[4772]: I0127 15:30:45.703507 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:37582->10.217.0.199:8775: read: connection reset by peer" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.175109 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.306449 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs\") pod \"3293f51d-380b-4247-b1ca-5d1f4b831e52\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.306603 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs\") pod \"3293f51d-380b-4247-b1ca-5d1f4b831e52\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.307052 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs" (OuterVolumeSpecName: "logs") pod "3293f51d-380b-4247-b1ca-5d1f4b831e52" (UID: "3293f51d-380b-4247-b1ca-5d1f4b831e52"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.307094 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6whx\" (UniqueName: \"kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx\") pod \"3293f51d-380b-4247-b1ca-5d1f4b831e52\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.307194 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle\") pod \"3293f51d-380b-4247-b1ca-5d1f4b831e52\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.307550 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data\") pod \"3293f51d-380b-4247-b1ca-5d1f4b831e52\" (UID: \"3293f51d-380b-4247-b1ca-5d1f4b831e52\") " Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.307954 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3293f51d-380b-4247-b1ca-5d1f4b831e52-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.332468 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx" (OuterVolumeSpecName: "kube-api-access-v6whx") pod "3293f51d-380b-4247-b1ca-5d1f4b831e52" (UID: "3293f51d-380b-4247-b1ca-5d1f4b831e52"). InnerVolumeSpecName "kube-api-access-v6whx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.335509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3293f51d-380b-4247-b1ca-5d1f4b831e52" (UID: "3293f51d-380b-4247-b1ca-5d1f4b831e52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.342231 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data" (OuterVolumeSpecName: "config-data") pod "3293f51d-380b-4247-b1ca-5d1f4b831e52" (UID: "3293f51d-380b-4247-b1ca-5d1f4b831e52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.360561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3293f51d-380b-4247-b1ca-5d1f4b831e52" (UID: "3293f51d-380b-4247-b1ca-5d1f4b831e52"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.409489 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.409519 4772 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.409529 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6whx\" (UniqueName: \"kubernetes.io/projected/3293f51d-380b-4247-b1ca-5d1f4b831e52-kube-api-access-v6whx\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.409537 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3293f51d-380b-4247-b1ca-5d1f4b831e52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.638570 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b83f7578-8113-46c8-be24-5968aa0ca563","Type":"ContainerStarted","Data":"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e"} Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.644970 4772 generic.go:334] "Generic (PLEG): container finished" podID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerID="5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2" exitCode=0 Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.645862 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.653102 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerDied","Data":"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2"} Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.653204 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3293f51d-380b-4247-b1ca-5d1f4b831e52","Type":"ContainerDied","Data":"bc2f5b8265c22782479449303b876d41140ba5bbf29e25aebf07297950204c86"} Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.653231 4772 scope.go:117] "RemoveContainer" containerID="5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.663764 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6637495810000003 podStartE2EDuration="2.663749581s" podCreationTimestamp="2026-01-27 15:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:46.658765107 +0000 UTC m=+1432.639374255" watchObservedRunningTime="2026-01-27 15:30:46.663749581 +0000 UTC m=+1432.644358679" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.684365 4772 scope.go:117] "RemoveContainer" containerID="2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.687570 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a8fd83-de04-4458-8ef8-ebe7ae60194f" path="/var/lib/kubelet/pods/c0a8fd83-de04-4458-8ef8-ebe7ae60194f/volumes" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.712428 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:46 crc 
kubenswrapper[4772]: I0127 15:30:46.713668 4772 scope.go:117] "RemoveContainer" containerID="5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.725468 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:46 crc kubenswrapper[4772]: E0127 15:30:46.725630 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2\": container with ID starting with 5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2 not found: ID does not exist" containerID="5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.725803 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2"} err="failed to get container status \"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2\": rpc error: code = NotFound desc = could not find container \"5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2\": container with ID starting with 5263e57d66912aeb6763dff6b7e8221a46b29b91e8c0bceeae9c0497237e7ec2 not found: ID does not exist" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.725847 4772 scope.go:117] "RemoveContainer" containerID="2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba" Jan 27 15:30:46 crc kubenswrapper[4772]: E0127 15:30:46.726327 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba\": container with ID starting with 2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba not found: ID does not exist" 
containerID="2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.726370 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba"} err="failed to get container status \"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba\": rpc error: code = NotFound desc = could not find container \"2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba\": container with ID starting with 2fca126b35386587e45cc8336f3cbcb29951790c38ccf77f0aca8a45525bbaba not found: ID does not exist" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.734356 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:46 crc kubenswrapper[4772]: E0127 15:30:46.734725 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.734740 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" Jan 27 15:30:46 crc kubenswrapper[4772]: E0127 15:30:46.734761 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.734769 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.734952 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-log" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.734973 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" containerName="nova-metadata-metadata" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.736056 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.739398 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.739675 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.740517 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.820286 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.820400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.820430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.820451 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b44p\" (UniqueName: \"kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.820484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.921641 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.921745 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.921764 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.921790 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b44p\" (UniqueName: 
\"kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.921820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.922293 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.926449 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.926888 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 15:30:46.928911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:46 crc kubenswrapper[4772]: I0127 
15:30:46.943821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b44p\" (UniqueName: \"kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p\") pod \"nova-metadata-0\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " pod="openstack/nova-metadata-0" Jan 27 15:30:47 crc kubenswrapper[4772]: I0127 15:30:47.051652 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:30:47 crc kubenswrapper[4772]: I0127 15:30:47.517567 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:30:47 crc kubenswrapper[4772]: I0127 15:30:47.666572 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerStarted","Data":"a940184dde4998665ff3925c8d268f050f05912d1265137578965cd151d251c3"} Jan 27 15:30:48 crc kubenswrapper[4772]: I0127 15:30:48.737682 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3293f51d-380b-4247-b1ca-5d1f4b831e52" path="/var/lib/kubelet/pods/3293f51d-380b-4247-b1ca-5d1f4b831e52/volumes" Jan 27 15:30:48 crc kubenswrapper[4772]: I0127 15:30:48.740662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerStarted","Data":"7343cd6a2a5cf705b558b4cc862d749d392235682218489d0106143cb8a5d4bc"} Jan 27 15:30:49 crc kubenswrapper[4772]: I0127 15:30:49.737283 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerStarted","Data":"db38347574e8ea3471da74617b5c2b8fd8e23430f530dbd434f5aba2a153f9bb"} Jan 27 15:30:49 crc kubenswrapper[4772]: I0127 15:30:49.769234 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.769205734 
podStartE2EDuration="3.769205734s" podCreationTimestamp="2026-01-27 15:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:30:49.76144535 +0000 UTC m=+1435.742054468" watchObservedRunningTime="2026-01-27 15:30:49.769205734 +0000 UTC m=+1435.749814832" Jan 27 15:30:50 crc kubenswrapper[4772]: I0127 15:30:50.039222 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 15:30:52 crc kubenswrapper[4772]: I0127 15:30:52.052568 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:30:52 crc kubenswrapper[4772]: I0127 15:30:52.052828 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 15:30:53 crc kubenswrapper[4772]: I0127 15:30:53.000903 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:30:53 crc kubenswrapper[4772]: I0127 15:30:53.000985 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 15:30:54 crc kubenswrapper[4772]: I0127 15:30:54.013355 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:54 crc kubenswrapper[4772]: I0127 15:30:54.013373 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:55 crc kubenswrapper[4772]: I0127 15:30:55.039700 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 15:30:55 crc kubenswrapper[4772]: I0127 15:30:55.066710 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 15:30:55 crc kubenswrapper[4772]: I0127 15:30:55.836380 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 15:30:57 crc kubenswrapper[4772]: I0127 15:30:57.052425 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 15:30:57 crc kubenswrapper[4772]: I0127 15:30:57.053467 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 15:30:58 crc kubenswrapper[4772]: I0127 15:30:58.066460 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:30:58 crc kubenswrapper[4772]: I0127 15:30:58.067113 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.007151 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.008099 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.013510 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.016132 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.886277 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 15:31:03 crc kubenswrapper[4772]: I0127 15:31:03.892936 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 15:31:05 crc kubenswrapper[4772]: I0127 15:31:05.897268 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 27 15:31:07 crc kubenswrapper[4772]: I0127 15:31:07.056758 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 15:31:07 crc kubenswrapper[4772]: I0127 15:31:07.058023 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 15:31:07 crc kubenswrapper[4772]: I0127 15:31:07.070576 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 15:31:07 crc kubenswrapper[4772]: I0127 15:31:07.926279 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.060962 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.061841 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="0edf6707-14dd-4986-8d64-0e48a31d6a39" containerName="openstackclient" containerID="cri-o://0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa" gracePeriod=2 Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.080543 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.108614 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-547fc"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.123130 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-547fc"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.251629 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:26 crc kubenswrapper[4772]: E0127 15:31:26.252129 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edf6707-14dd-4986-8d64-0e48a31d6a39" containerName="openstackclient" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.252152 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edf6707-14dd-4986-8d64-0e48a31d6a39" containerName="openstackclient" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.252432 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edf6707-14dd-4986-8d64-0e48a31d6a39" containerName="openstackclient" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.253124 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.259159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.259278 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thp4z\" (UniqueName: \"kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.264732 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.290545 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.311809 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97c3-account-create-update-xlghl"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.413334 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.417727 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.418196 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thp4z\" (UniqueName: \"kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.427244 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-97c3-account-create-update-xlghl"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.499261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thp4z\" (UniqueName: \"kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z\") pod \"root-account-create-update-qmppl\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.549858 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.551534 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.567841 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.583185 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.632214 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j94f\" (UniqueName: \"kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.632359 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.654182 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.694951 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af61fb8e-e749-4872-8dc6-c590e4b9787a" path="/var/lib/kubelet/pods/af61fb8e-e749-4872-8dc6-c590e4b9787a/volumes" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.695636 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1eaad6-cd29-4189-8ecd-62b7658e69ef" path="/var/lib/kubelet/pods/dd1eaad6-cd29-4189-8ecd-62b7658e69ef/volumes" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.730280 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a648-account-create-update-qhx8z"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.738407 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.738637 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j94f\" (UniqueName: \"kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.739783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.800576 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a648-account-create-update-qhx8z"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.834372 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j94f\" (UniqueName: \"kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f\") pod \"barbican-97c3-account-create-update-bvlvs\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.864274 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.908912 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.930271 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.930619 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="ovn-northd" containerID="cri-o://f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" gracePeriod=30 Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.931119 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="openstack-network-exporter" containerID="cri-o://b1542ba131aec1cffd5520f2969b843d3aa12fe7b4cd60022addce3e73977b99" gracePeriod=30 Jan 27 15:31:26 crc kubenswrapper[4772]: I0127 15:31:26.989238 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-d2a3-account-create-update-hfkkb"] Jan 27 15:31:27 crc kubenswrapper[4772]: E0127 15:31:27.023355 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:27 crc kubenswrapper[4772]: E0127 15:31:27.023743 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data podName:76fdbdb1-d48a-4cd1-8372-78887671dce8 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:27.523722509 +0000 UTC m=+1473.504331597 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data") pod "rabbitmq-cell1-server-0" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8") : configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.029968 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e8b1-account-create-update-8rlww"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.065816 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.107235 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d2a3-account-create-update-hfkkb"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.114032 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e8b1-account-create-update-8rlww"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.147616 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gxjzh"] Jan 27 15:31:27 crc kubenswrapper[4772]: E0127 15:31:27.156710 4772 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.134:35824->38.129.56.134:35895: write tcp 38.129.56.134:35824->38.129.56.134:35895: write: broken pipe Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.200357 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.200538 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-vqpfg" podUID="a490a71b-c33d-4c94-9592-f97d1d315e81" containerName="openstack-network-exporter" containerID="cri-o://b93ad84c922746d427d3e2a2deb04a875a239fcafbecb5146ae05b1b11e36a09" gracePeriod=30 Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.254319 4772 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-vdmv7"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.286377 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-vdmv7"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.326226 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8l85z"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.348835 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pmk27"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.386217 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8l85z"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.416732 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pmk27"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.465278 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zf2tx"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.498315 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zf2tx"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.542407 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-d4llz"] Jan 27 15:31:27 crc kubenswrapper[4772]: E0127 15:31:27.543559 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:27 crc kubenswrapper[4772]: E0127 15:31:27.543608 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data podName:76fdbdb1-d48a-4cd1-8372-78887671dce8 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:28.543593843 +0000 UTC m=+1474.524202941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data") pod "rabbitmq-cell1-server-0" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8") : configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.566877 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-d4llz"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.598259 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5104-account-create-update-vp7x7"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.632600 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5104-account-create-update-vp7x7"] Jan 27 15:31:27 crc kubenswrapper[4772]: I0127 15:31:27.657934 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-v689b"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.700513 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.755594 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-v689b"] Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:27.805665 4772 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-gxjzh" message="Exiting ovn-controller (1) " Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:27.805738 4772 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-gxjzh" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" 
containerID="cri-o://afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.813860 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-gxjzh" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" containerID="cri-o://afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.821619 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.823025 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="openstack-network-exporter" containerID="cri-o://9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4" gracePeriod=300 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.891459 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dr2w8"] Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:27.895644 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:27.895698 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data podName:508c3d5b-212a-46da-9a55-de3f35d7019b nodeName:}" failed. No retries permitted until 2026-01-27 15:31:28.395682848 +0000 UTC m=+1474.376291946 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data") pod "rabbitmq-server-0" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b") : configmap "rabbitmq-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.925668 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dr2w8"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.935685 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6af8-account-create-update-ltwnh"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.952356 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6af8-account-create-update-ltwnh"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.971293 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5ch7"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:27.982509 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h5ch7"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.002237 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.002508 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="dnsmasq-dns" containerID="cri-o://0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86" gracePeriod=10 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.003983 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.004345 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" 
podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="openstack-network-exporter" containerID="cri-o://fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed" gracePeriod=300 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.040613 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="ovsdbserver-nb" containerID="cri-o://abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c" gracePeriod=300 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.077498 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.078016 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-server" containerID="cri-o://d35aa807e61d39133b8319305719556fcfa6889495c80253864eaf2dc48a450b" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082606 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-updater" containerID="cri-o://99c9f47c0720632dfecbfc5e9152885ab96d751677b561767c79f0a032ca5cf5" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082763 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="swift-recon-cron" containerID="cri-o://0b50101071feccad5793667a8f4849d22482c6d522fac228c249d69d6d557cdf" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082808 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="rsync" 
containerID="cri-o://8d889567d10b3e8868d76680ff442da2a14216919aae766c356918ec9960b9a4" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082846 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-expirer" containerID="cri-o://c1cf3012e8501ba3a809e028a1ab49c960d95fb090a04b4dbca6cd01d2de9524" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082900 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-updater" containerID="cri-o://b0a7c137687a720a7d8c3f84cc586f4b9d3bde7c9bc9e2e0c83a325c2ae23322" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082935 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-auditor" containerID="cri-o://8bbb31c1be222187b0e9b27f07c1ac0fe66d8ad583df4ff6b26fec62ab98cf87" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082966 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-replicator" containerID="cri-o://71b4242b9081be055bfb8bd2db6959d32259cd0c3ee2b95ddde1c1d2154be74b" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.082996 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-server" containerID="cri-o://bc57f117c387fb10832190ea21f63cdb319308d9390292395fb515e28966d217" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083044 4772 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-reaper" containerID="cri-o://ac32767b3784713a66fbfe32a337398a7461aa8ffad58bbfea7ccf6e3c4ee19d" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083076 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-auditor" containerID="cri-o://0c6f6ecf89a4947c23560538762ca73dfe5e13c4acb04e206d91772a3cfc9c49" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083113 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-replicator" containerID="cri-o://94e4c588a745acb16ce919a52f7150cf54119c1c41e94c9e658206e6b58958ed" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083188 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-auditor" containerID="cri-o://c3f602f5b8fe5f978c40989adc1d0130c6aaae0dce0fc13d5e34bbe819e8eccb" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083226 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-replicator" containerID="cri-o://5f271cd2dcb6b658cde722402c5b2945c28f4d7486cab8c56e064081779416a1" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.083188 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-server" containerID="cri-o://494d3ebaeddb756bf375d2bc394a4b4086ee3e25d9a76747552d41c1f40a9737" gracePeriod=30 Jan 27 15:31:28 crc 
kubenswrapper[4772]: I0127 15:31:28.121389 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.121661 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-log" containerID="cri-o://d767e789b4befb7b8caac693075691222c00bb6ae1189417345706dad41621f9" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.122039 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-httpd" containerID="cri-o://3114715e24bc63a93ce31ec7ec2cc2fdeaad0a6c7647de22f23d06ac45e3d864" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.163287 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.163647 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="cinder-scheduler" containerID="cri-o://3e806373a2604b5465de7a3913d6865c82f0689bac61f26c430950d7d4efb948" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.164156 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="probe" containerID="cri-o://112ddc6068b3694383f83c1ffece42788a7623920d1c02ff9f46202f7c8c0d7e" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.209200 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dpr42"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.251759 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-dpr42"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.267595 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.267819 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-597699949b-q6msx" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-log" containerID="cri-o://f10ed54f4ea68e56be83b8d8387a9768612b5c035b1fc42928132066af5bd689" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.268149 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-597699949b-q6msx" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-api" containerID="cri-o://ad26ca4835a223df0b0aa3065e02d9e54b67030d2b6d0436f1f1a0dd7bf06415" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.276039 4772 generic.go:334] "Generic (PLEG): container finished" podID="220011f2-8778-4a14-82d4-33a07bd33379" containerID="afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf" exitCode=0 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.276118 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh" event={"ID":"220011f2-8778-4a14-82d4-33a07bd33379","Type":"ContainerDied","Data":"afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf"} Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.313325 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2/ovsdbserver-nb/0.log" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.313381 4772 generic.go:334] "Generic (PLEG): container finished" podID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerID="9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4" exitCode=2 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 
15:31:28.313472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerDied","Data":"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4"} Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.333837 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.334094 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api-log" containerID="cri-o://26cc6d1f580535edc969fb0f7d0d2e7d716fa8450f944ca1657554f90801529b" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.334384 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api" containerID="cri-o://c47159ab0aee5087f5a44073988d2ad8d6aaaa0e47ba7702dc2a03eab229b375" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.339290 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vqpfg_a490a71b-c33d-4c94-9592-f97d1d315e81/openstack-network-exporter/0.log" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.339350 4772 generic.go:334] "Generic (PLEG): container finished" podID="a490a71b-c33d-4c94-9592-f97d1d315e81" containerID="b93ad84c922746d427d3e2a2deb04a875a239fcafbecb5146ae05b1b11e36a09" exitCode=2 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.339468 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqpfg" event={"ID":"a490a71b-c33d-4c94-9592-f97d1d315e81","Type":"ContainerDied","Data":"b93ad84c922746d427d3e2a2deb04a875a239fcafbecb5146ae05b1b11e36a09"} Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.363917 4772 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.364228 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-log" containerID="cri-o://3454f9899adaff309b52934e71697924735c1f269fb473444cba03b5baf4e1e5" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.364767 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-httpd" containerID="cri-o://6481b50eed7f8997cc197c4b50a1b5d1b9aa395b3745aa30ff2d6ee451d23215" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.366471 4772 generic.go:334] "Generic (PLEG): container finished" podID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerID="b1542ba131aec1cffd5520f2969b843d3aa12fe7b4cd60022addce3e73977b99" exitCode=2 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.366515 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerDied","Data":"b1542ba131aec1cffd5520f2969b843d3aa12fe7b4cd60022addce3e73977b99"} Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.373745 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.374036 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-647c88bb6f-wzf82" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-api" containerID="cri-o://72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.374724 4772 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-647c88bb6f-wzf82" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-httpd" containerID="cri-o://b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:28.417422 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:28.417510 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data podName:508c3d5b-212a-46da-9a55-de3f35d7019b nodeName:}" failed. No retries permitted until 2026-01-27 15:31:29.417476347 +0000 UTC m=+1475.398085445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data") pod "rabbitmq-server-0" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b") : configmap "rabbitmq-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.429039 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="ovsdbserver-sb" containerID="cri-o://abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593" gracePeriod=300 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.448263 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.448639 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-log" containerID="cri-o://db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.480256 4772 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-api-0" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-api" containerID="cri-o://abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.501566 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ed9a-account-create-update-b7pnl"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.515710 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ed9a-account-create-update-b7pnl"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.530084 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-z224f"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.543287 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-z224f"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.559215 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.559479 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-556764fb84-r628x" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener-log" containerID="cri-o://f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.561582 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-556764fb84-r628x" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener" containerID="cri-o://aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.584260 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-db-create-v7ncm"] Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:28.622490 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: E0127 15:31:28.622548 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data podName:76fdbdb1-d48a-4cd1-8372-78887671dce8 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:30.622534893 +0000 UTC m=+1476.603143991 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data") pod "rabbitmq-cell1-server-0" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8") : configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.648789 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-v7ncm"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.656904 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.658043 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" containerID="cri-o://7343cd6a2a5cf705b558b4cc862d749d392235682218489d0106143cb8a5d4bc" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.658194 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" containerID="cri-o://db38347574e8ea3471da74617b5c2b8fd8e23430f530dbd434f5aba2a153f9bb" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.683606 4772 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="2313c291-4eb5-4b79-ad9b-b04cd06a1ee9" path="/var/lib/kubelet/pods/2313c291-4eb5-4b79-ad9b-b04cd06a1ee9/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.703733 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d10501c-aefc-4b6b-934a-cd53db7aa029" path="/var/lib/kubelet/pods/2d10501c-aefc-4b6b-934a-cd53db7aa029/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.704634 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="564de425-5170-45df-9080-5b02579483ee" path="/var/lib/kubelet/pods/564de425-5170-45df-9080-5b02579483ee/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.705197 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a423229-06be-4934-9715-58105e1af686" path="/var/lib/kubelet/pods/5a423229-06be-4934-9715-58105e1af686/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.705711 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f24c00-a64a-4e82-a125-c0ee3fe8fa8f" path="/var/lib/kubelet/pods/69f24c00-a64a-4e82-a125-c0ee3fe8fa8f/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.706707 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752279e5-88ff-469d-a4db-2942659c7e24" path="/var/lib/kubelet/pods/752279e5-88ff-469d-a4db-2942659c7e24/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.707221 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86d0241f-ae16-400f-837c-3b43c904c91e" path="/var/lib/kubelet/pods/86d0241f-ae16-400f-837c-3b43c904c91e/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.707912 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ae05919-68bf-43d1-abd9-9908ec287bd0" path="/var/lib/kubelet/pods/9ae05919-68bf-43d1-abd9-9908ec287bd0/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.715639 4772 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="9cbda9cc-3ec5-4193-a7fb-ff06bdd20846" path="/var/lib/kubelet/pods/9cbda9cc-3ec5-4193-a7fb-ff06bdd20846/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.726510 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af586fb2-38ff-4e17-86bc-a7793cb3ac45" path="/var/lib/kubelet/pods/af586fb2-38ff-4e17-86bc-a7793cb3ac45/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.738442 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0625578-3b48-44c7-9082-174fce3a7e74" path="/var/lib/kubelet/pods/b0625578-3b48-44c7-9082-174fce3a7e74/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.739593 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b163780a-6dd7-4232-b0da-a22f18d36fcc" path="/var/lib/kubelet/pods/b163780a-6dd7-4232-b0da-a22f18d36fcc/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.740187 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b412abae-93af-4ae0-8cd8-7c0a827da4b3" path="/var/lib/kubelet/pods/b412abae-93af-4ae0-8cd8-7c0a827da4b3/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.741081 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54b2036-d943-4f0d-b1c4-8a47dfab5099" path="/var/lib/kubelet/pods/c54b2036-d943-4f0d-b1c4-8a47dfab5099/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.742282 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2e7e0f-aef9-4687-932c-d21f24fd4bff" path="/var/lib/kubelet/pods/ea2e7e0f-aef9-4687-932c-d21f24fd4bff/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.752254 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1746148-2e3f-476f-9a1f-f3656d44fb0b" path="/var/lib/kubelet/pods/f1746148-2e3f-476f-9a1f-f3656d44fb0b/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.752859 4772 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6" path="/var/lib/kubelet/pods/f2cbbc00-4796-4a0a-943d-a8d0c5cd11c6/volumes" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.753412 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.753444 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.753456 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.753686 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-659485ddbb-5bnzg" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api-log" containerID="cri-o://23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.754379 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-659485ddbb-5bnzg" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api" containerID="cri-o://ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.774636 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.780546 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6748df9c8c-zk7zp" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker-log" containerID="cri-o://b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.780560 4772 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6748df9c8c-zk7zp" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker" containerID="cri-o://a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546" gracePeriod=30 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.806125 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.806387 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.201:5353: connect: connection refused" Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.831394 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cg94r"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.849743 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-12a3-account-create-update-mdv84"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.852772 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cg94r"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.864206 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-12a3-account-create-update-mdv84"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.883800 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-xpbb6"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.893502 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6w7p7"] Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.907245 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" 
podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="rabbitmq" containerID="cri-o://d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a" gracePeriod=604800 Jan 27 15:31:28 crc kubenswrapper[4772]: I0127 15:31:28.960242 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-xpbb6"] Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.010359 4772 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Jan 27 15:31:29 crc kubenswrapper[4772]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 27 15:31:29 crc kubenswrapper[4772]: + source /usr/local/bin/container-scripts/functions Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNBridge=br-int Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNRemote=tcp:localhost:6642 Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNEncapType=geneve Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNAvailabilityZones= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ EnableChassisAsGateway=true Jan 27 15:31:29 crc kubenswrapper[4772]: ++ PhysicalNetworks= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNHostName= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 27 15:31:29 crc kubenswrapper[4772]: ++ ovs_dir=/var/lib/openvswitch Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 27 15:31:29 crc kubenswrapper[4772]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + cleanup_ovsdb_server_semaphore Jan 27 15:31:29 crc kubenswrapper[4772]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 27 15:31:29 crc kubenswrapper[4772]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-cqx7r" message=< Jan 27 15:31:29 crc kubenswrapper[4772]: Exiting ovsdb-server (5) [ OK ] Jan 27 15:31:29 crc kubenswrapper[4772]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 27 15:31:29 crc kubenswrapper[4772]: + source /usr/local/bin/container-scripts/functions Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNBridge=br-int Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNRemote=tcp:localhost:6642 Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNEncapType=geneve Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNAvailabilityZones= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ EnableChassisAsGateway=true Jan 27 15:31:29 crc kubenswrapper[4772]: ++ PhysicalNetworks= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNHostName= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 27 15:31:29 crc kubenswrapper[4772]: ++ ovs_dir=/var/lib/openvswitch Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 27 15:31:29 crc kubenswrapper[4772]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + cleanup_ovsdb_server_semaphore Jan 27 15:31:29 crc kubenswrapper[4772]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 27 15:31:29 crc kubenswrapper[4772]: > Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.010418 4772 kuberuntime_container.go:691] "PreStop hook failed" err=< Jan 27 15:31:29 crc kubenswrapper[4772]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Jan 27 15:31:29 crc kubenswrapper[4772]: + source /usr/local/bin/container-scripts/functions Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNBridge=br-int Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNRemote=tcp:localhost:6642 Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNEncapType=geneve Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNAvailabilityZones= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ EnableChassisAsGateway=true Jan 27 15:31:29 crc kubenswrapper[4772]: ++ PhysicalNetworks= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ OVNHostName= Jan 27 15:31:29 crc kubenswrapper[4772]: ++ DB_FILE=/etc/openvswitch/conf.db Jan 27 15:31:29 crc kubenswrapper[4772]: ++ 
ovs_dir=/var/lib/openvswitch Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Jan 27 15:31:29 crc kubenswrapper[4772]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Jan 27 15:31:29 crc kubenswrapper[4772]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + sleep 0.5 Jan 27 15:31:29 crc kubenswrapper[4772]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Jan 27 15:31:29 crc kubenswrapper[4772]: + cleanup_ovsdb_server_semaphore Jan 27 15:31:29 crc kubenswrapper[4772]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Jan 27 15:31:29 crc kubenswrapper[4772]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Jan 27 15:31:29 crc kubenswrapper[4772]: > pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" containerID="cri-o://4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.010477 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" containerID="cri-o://4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" gracePeriod=29 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.016516 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6w7p7"] Jan 27 
15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.038858 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-gbrww"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.050241 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pszgr"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.070417 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-gbrww"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.087879 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pszgr"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.133364 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerName="galera" containerID="cri-o://2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69" gracePeriod=30 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.135039 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.135241 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce" gracePeriod=30 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.146652 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.155675 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjwh2"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.182990 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.183239 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" containerID="cri-o://788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" gracePeriod=30 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.207726 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjwh2"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.233394 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.233635 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b20b9215-5398-4100-bac4-763daa5ed222" containerName="nova-cell0-conductor-conductor" containerID="cri-o://2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" gracePeriod=30 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.240606 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-v9mqp"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.258229 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-v9mqp"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.259703 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" containerID="cri-o://d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" gracePeriod=28 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.268263 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:29 crc 
kubenswrapper[4772]: I0127 15:31:29.279830 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.280702 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.280852 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler" containerID="cri-o://e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" gracePeriod=30 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.293599 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="rabbitmq" containerID="cri-o://f002759dea4443f7600e0f76f24481c1604449a5ee31bd8aa53171a2121ec4b2" gracePeriod=604800 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.321282 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.343544 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.343822 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fqp8\" (UniqueName: \"kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.343960 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.344093 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.344293 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.344694 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config\") pod \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\" (UID: \"fddd5e59-3124-4a05-aafd-92d6aea05f7e\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.352988 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8" (OuterVolumeSpecName: "kube-api-access-6fqp8") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "kube-api-access-6fqp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.353483 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2/ovsdbserver-nb/0.log" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.353738 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.365413 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.366800 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vqpfg_a490a71b-c33d-4c94-9592-f97d1d315e81/openstack-network-exporter/0.log" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.366846 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vqpfg" Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.381457 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.395513 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.419265 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.419329 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.432330 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc34a3a4-ad0b-4154-82c9-728227b19732/ovsdbserver-sb/0.log" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.432421 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.440534 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.444693 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.448821 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.448877 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.448909 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle\") pod \"0edf6707-14dd-4986-8d64-0e48a31d6a39\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " Jan 27 15:31:29 crc 
kubenswrapper[4772]: I0127 15:31:29.448943 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdkbw\" (UniqueName: \"kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449008 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449042 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449069 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config\") pod \"0edf6707-14dd-4986-8d64-0e48a31d6a39\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449204 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449231 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret\") pod \"0edf6707-14dd-4986-8d64-0e48a31d6a39\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449257 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmhcm\" (UniqueName: \"kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkhnj\" (UniqueName: \"kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj\") pod \"0edf6707-14dd-4986-8d64-0e48a31d6a39\" (UID: \"0edf6707-14dd-4986-8d64-0e48a31d6a39\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: 
\"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449416 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sclc\" (UniqueName: \"kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449444 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449483 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449513 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449536 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449576 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449632 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle\") pod \"220011f2-8778-4a14-82d4-33a07bd33379\" (UID: \"220011f2-8778-4a14-82d4-33a07bd33379\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449658 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config\") pod \"a490a71b-c33d-4c94-9592-f97d1d315e81\" (UID: \"a490a71b-c33d-4c94-9592-f97d1d315e81\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449731 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.449779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config\") pod \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\" (UID: \"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.450975 4772 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fqp8\" (UniqueName: \"kubernetes.io/projected/fddd5e59-3124-4a05-aafd-92d6aea05f7e-kube-api-access-6fqp8\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.451022 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.451032 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.451098 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 27 15:31:29 crc kubenswrapper[4772]: E0127 15:31:29.451148 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data podName:508c3d5b-212a-46da-9a55-de3f35d7019b nodeName:}" failed. No retries permitted until 2026-01-27 15:31:31.451130728 +0000 UTC m=+1477.431739826 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data") pod "rabbitmq-server-0" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b") : configmap "rabbitmq-config-data" not found Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.451346 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config" (OuterVolumeSpecName: "config") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.452912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.453417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run" (OuterVolumeSpecName: "var-run") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.454374 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.459631 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config" (OuterVolumeSpecName: "config") pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.463813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.464092 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.464335 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts" (OuterVolumeSpecName: "scripts") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.464675 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.466829 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.467116 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts" (OuterVolumeSpecName: "scripts") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.472588 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.475632 4772 generic.go:334] "Generic (PLEG): container finished" podID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerID="7343cd6a2a5cf705b558b4cc862d749d392235682218489d0106143cb8a5d4bc" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.476253 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc" (OuterVolumeSpecName: "kube-api-access-5sclc") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). 
InnerVolumeSpecName "kube-api-access-5sclc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.475847 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerDied","Data":"7343cd6a2a5cf705b558b4cc862d749d392235682218489d0106143cb8a5d4bc"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.481416 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm" (OuterVolumeSpecName: "kube-api-access-dmhcm") pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "kube-api-access-dmhcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.483539 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw" (OuterVolumeSpecName: "kube-api-access-vdkbw") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "kube-api-access-vdkbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.485398 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dc34a3a4-ad0b-4154-82c9-728227b19732/ovsdbserver-sb/0.log" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.485542 4772 generic.go:334] "Generic (PLEG): container finished" podID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerID="fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed" exitCode=2 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.485662 4772 generic.go:334] "Generic (PLEG): container finished" podID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerID="abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.485801 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc34a3a4-ad0b-4154-82c9-728227b19732","Type":"ContainerDied","Data":"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.485914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dc34a3a4-ad0b-4154-82c9-728227b19732","Type":"ContainerDied","Data":"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.486012 4772 scope.go:117] "RemoveContainer" containerID="fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.486317 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.492985 4772 generic.go:334] "Generic (PLEG): container finished" podID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerID="112ddc6068b3694383f83c1ffece42788a7623920d1c02ff9f46202f7c8c0d7e" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.493062 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerDied","Data":"112ddc6068b3694383f83c1ffece42788a7623920d1c02ff9f46202f7c8c0d7e"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.495400 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.499169 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj" (OuterVolumeSpecName: "kube-api-access-xkhnj") pod "0edf6707-14dd-4986-8d64-0e48a31d6a39" (UID: "0edf6707-14dd-4986-8d64-0e48a31d6a39"). InnerVolumeSpecName "kube-api-access-xkhnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.501784 4772 generic.go:334] "Generic (PLEG): container finished" podID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerID="b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.501872 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerDied","Data":"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.511019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qmppl" event={"ID":"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6","Type":"ContainerStarted","Data":"3cdc8204c9c28616053d96ae2843e1dddf8646f9b546be00bd78d90869086025"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.511074 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qmppl" event={"ID":"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6","Type":"ContainerStarted","Data":"d5deb0f3cfb55cc15d206b2ff6d6a2e4b5ccc8b9efe8608e2073fe3df0f8d559"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.522925 4772 generic.go:334] "Generic (PLEG): container finished" podID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerID="0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.522985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" event={"ID":"fddd5e59-3124-4a05-aafd-92d6aea05f7e","Type":"ContainerDied","Data":"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.523009 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" 
event={"ID":"fddd5e59-3124-4a05-aafd-92d6aea05f7e","Type":"ContainerDied","Data":"c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.523066 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-2hd4f" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.541204 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0edf6707-14dd-4986-8d64-0e48a31d6a39" (UID: "0edf6707-14dd-4986-8d64-0e48a31d6a39"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.548129 4772 generic.go:334] "Generic (PLEG): container finished" podID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerID="f10ed54f4ea68e56be83b8d8387a9768612b5c035b1fc42928132066af5bd689" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.548229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerDied","Data":"f10ed54f4ea68e56be83b8d8387a9768612b5c035b1fc42928132066af5bd689"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556294 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: 
\"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556451 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556542 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556587 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556655 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wt7\" (UniqueName: \"kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556720 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.556738 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config\") pod \"dc34a3a4-ad0b-4154-82c9-728227b19732\" (UID: \"dc34a3a4-ad0b-4154-82c9-728227b19732\") " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557126 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sclc\" (UniqueName: \"kubernetes.io/projected/220011f2-8778-4a14-82d4-33a07bd33379-kube-api-access-5sclc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557136 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557146 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557155 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557166 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557188 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a490a71b-c33d-4c94-9592-f97d1d315e81-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557206 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557215 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557225 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557234 4772 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a490a71b-c33d-4c94-9592-f97d1d315e81-ovs-rundir\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557242 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdkbw\" (UniqueName: \"kubernetes.io/projected/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-kube-api-access-vdkbw\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557250 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557260 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557270 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/220011f2-8778-4a14-82d4-33a07bd33379-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557278 4772 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557286 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/220011f2-8778-4a14-82d4-33a07bd33379-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557294 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmhcm\" (UniqueName: \"kubernetes.io/projected/a490a71b-c33d-4c94-9592-f97d1d315e81-kube-api-access-dmhcm\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.557303 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkhnj\" (UniqueName: \"kubernetes.io/projected/0edf6707-14dd-4986-8d64-0e48a31d6a39-kube-api-access-xkhnj\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.565350 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config" (OuterVolumeSpecName: "config") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.566359 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.567042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts" (OuterVolumeSpecName: "scripts") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.568819 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.574387 4772 generic.go:334] "Generic (PLEG): container finished" podID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.574464 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerDied","Data":"4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.576824 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ce27714-673f-47de-acc3-b6902b534bdd" containerID="f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.576942 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" 
event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerDied","Data":"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.598532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7" (OuterVolumeSpecName: "kube-api-access-g8wt7") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "kube-api-access-g8wt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.604503 4772 generic.go:334] "Generic (PLEG): container finished" podID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerID="23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.604629 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerDied","Data":"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.630599 4772 generic.go:334] "Generic (PLEG): container finished" podID="be772158-a71c-448d-8972-014f0d3a9ab8" containerID="26cc6d1f580535edc969fb0f7d0d2e7d716fa8450f944ca1657554f90801529b" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.630678 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerDied","Data":"26cc6d1f580535edc969fb0f7d0d2e7d716fa8450f944ca1657554f90801529b"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.631833 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660297 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660662 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wt7\" (UniqueName: \"kubernetes.io/projected/dc34a3a4-ad0b-4154-82c9-728227b19732-kube-api-access-g8wt7\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660679 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660696 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660707 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dc34a3a4-ad0b-4154-82c9-728227b19732-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.660733 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.668702 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2/ovsdbserver-nb/0.log" Jan 27 15:31:29 crc 
kubenswrapper[4772]: I0127 15:31:29.668753 4772 generic.go:334] "Generic (PLEG): container finished" podID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerID="abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.668826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerDied","Data":"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.668853 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2","Type":"ContainerDied","Data":"2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.668920 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.691588 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729347 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="8d889567d10b3e8868d76680ff442da2a14216919aae766c356918ec9960b9a4" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729533 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="c1cf3012e8501ba3a809e028a1ab49c960d95fb090a04b4dbca6cd01d2de9524" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729586 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="b0a7c137687a720a7d8c3f84cc586f4b9d3bde7c9bc9e2e0c83a325c2ae23322" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729633 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="8bbb31c1be222187b0e9b27f07c1ac0fe66d8ad583df4ff6b26fec62ab98cf87" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729678 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="71b4242b9081be055bfb8bd2db6959d32259cd0c3ee2b95ddde1c1d2154be74b" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729757 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="bc57f117c387fb10832190ea21f63cdb319308d9390292395fb515e28966d217" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729807 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="99c9f47c0720632dfecbfc5e9152885ab96d751677b561767c79f0a032ca5cf5" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729859 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="0c6f6ecf89a4947c23560538762ca73dfe5e13c4acb04e206d91772a3cfc9c49" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729940 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="94e4c588a745acb16ce919a52f7150cf54119c1c41e94c9e658206e6b58958ed" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.729991 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="494d3ebaeddb756bf375d2bc394a4b4086ee3e25d9a76747552d41c1f40a9737" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730039 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="ac32767b3784713a66fbfe32a337398a7461aa8ffad58bbfea7ccf6e3c4ee19d" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730093 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="c3f602f5b8fe5f978c40989adc1d0130c6aaae0dce0fc13d5e34bbe819e8eccb" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730140 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="5f271cd2dcb6b658cde722402c5b2945c28f4d7486cab8c56e064081779416a1" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730208 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="d35aa807e61d39133b8319305719556fcfa6889495c80253864eaf2dc48a450b" exitCode=0 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730300 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"8d889567d10b3e8868d76680ff442da2a14216919aae766c356918ec9960b9a4"} Jan 27 15:31:29 crc 
kubenswrapper[4772]: I0127 15:31:29.730372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"c1cf3012e8501ba3a809e028a1ab49c960d95fb090a04b4dbca6cd01d2de9524"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730436 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"b0a7c137687a720a7d8c3f84cc586f4b9d3bde7c9bc9e2e0c83a325c2ae23322"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730490 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"8bbb31c1be222187b0e9b27f07c1ac0fe66d8ad583df4ff6b26fec62ab98cf87"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730540 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"71b4242b9081be055bfb8bd2db6959d32259cd0c3ee2b95ddde1c1d2154be74b"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"bc57f117c387fb10832190ea21f63cdb319308d9390292395fb515e28966d217"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730641 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"99c9f47c0720632dfecbfc5e9152885ab96d751677b561767c79f0a032ca5cf5"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"0c6f6ecf89a4947c23560538762ca73dfe5e13c4acb04e206d91772a3cfc9c49"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730755 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"94e4c588a745acb16ce919a52f7150cf54119c1c41e94c9e658206e6b58958ed"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730807 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"494d3ebaeddb756bf375d2bc394a4b4086ee3e25d9a76747552d41c1f40a9737"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730857 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"ac32767b3784713a66fbfe32a337398a7461aa8ffad58bbfea7ccf6e3c4ee19d"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730907 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"c3f602f5b8fe5f978c40989adc1d0130c6aaae0dce0fc13d5e34bbe819e8eccb"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.730956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"5f271cd2dcb6b658cde722402c5b2945c28f4d7486cab8c56e064081779416a1"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.731010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"d35aa807e61d39133b8319305719556fcfa6889495c80253864eaf2dc48a450b"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 
15:31:29.745647 4772 generic.go:334] "Generic (PLEG): container finished" podID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerID="db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597" exitCode=143 Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.745742 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerDied","Data":"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.755750 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.760153 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.766631 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-gxjzh" event={"ID":"220011f2-8778-4a14-82d4-33a07bd33379","Type":"ContainerDied","Data":"3cb1a1a1b7113cd35f8e36164d72f2c95422afc48122fb52f34747897808a62b"} Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.766672 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-gxjzh"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.779143 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.780339 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.780453 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.783325 4772 generic.go:334] "Generic (PLEG): container finished" podID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerID="b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2" exitCode=0
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.783472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerDied","Data":"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2"}
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.788939 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.792878 4772 generic.go:334] "Generic (PLEG): container finished" podID="9a02b617-28a7-4262-a110-f1c71763ad19" containerID="d767e789b4befb7b8caac693075691222c00bb6ae1189417345706dad41621f9" exitCode=143
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.792968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerDied","Data":"d767e789b4befb7b8caac693075691222c00bb6ae1189417345706dad41621f9"}
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.798439 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.801663 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vqpfg_a490a71b-c33d-4c94-9592-f97d1d315e81/openstack-network-exporter/0.log"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.801922 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vqpfg"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.802230 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vqpfg" event={"ID":"a490a71b-c33d-4c94-9592-f97d1d315e81","Type":"ContainerDied","Data":"b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca"}
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.827108 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0edf6707-14dd-4986-8d64-0e48a31d6a39" (UID: "0edf6707-14dd-4986-8d64-0e48a31d6a39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.837237 4772 generic.go:334] "Generic (PLEG): container finished" podID="0edf6707-14dd-4986-8d64-0e48a31d6a39" containerID="0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa" exitCode=137
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.837370 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.851928 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-qmppl" podStartSLOduration=3.851904759 podStartE2EDuration="3.851904759s" podCreationTimestamp="2026-01-27 15:31:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:31:29.531512051 +0000 UTC m=+1475.512121149" watchObservedRunningTime="2026-01-27 15:31:29.851904759 +0000 UTC m=+1475.832513867"
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.870643 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"]
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.882379 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.882450 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.882463 4772 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.887763 4772 generic.go:334] "Generic (PLEG): container finished" podID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerID="3454f9899adaff309b52934e71697924735c1f269fb473444cba03b5baf4e1e5" exitCode=143
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.887816 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerDied","Data":"3454f9899adaff309b52934e71697924735c1f269fb473444cba03b5baf4e1e5"}
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.894025 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0edf6707-14dd-4986-8d64-0e48a31d6a39" (UID: "0edf6707-14dd-4986-8d64-0e48a31d6a39"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.894694 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "220011f2-8778-4a14-82d4-33a07bd33379" (UID: "220011f2-8778-4a14-82d4-33a07bd33379"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.984761 4772 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0edf6707-14dd-4986-8d64-0e48a31d6a39-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:29 crc kubenswrapper[4772]: I0127 15:31:29.984797 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/220011f2-8778-4a14-82d4-33a07bd33379-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.009094 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.025402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" (UID: "4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.031284 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config" (OuterVolumeSpecName: "config") pod "fddd5e59-3124-4a05-aafd-92d6aea05f7e" (UID: "fddd5e59-3124-4a05-aafd-92d6aea05f7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.035357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.043247 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "dc34a3a4-ad0b-4154-82c9-728227b19732" (UID: "dc34a3a4-ad0b-4154-82c9-728227b19732"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.068388 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.078442 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.082364 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "a490a71b-c33d-4c94-9592-f97d1d315e81" (UID: "a490a71b-c33d-4c94-9592-f97d1d315e81"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095497 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a490a71b-c33d-4c94-9592-f97d1d315e81-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095551 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095567 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095580 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc34a3a4-ad0b-4154-82c9-728227b19732-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095595 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fddd5e59-3124-4a05-aafd-92d6aea05f7e-config\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.095607 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.102769 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.102866 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.210671 4772 scope.go:117] "RemoveContainer" containerID="abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.252545 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.253278 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-httpd" containerID="cri-o://a476d84a3741734575b073569a645d9d973c5cdbb39812aa454a7257859db22b" gracePeriod=30
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.253478 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-server" containerID="cri-o://47a1d8c4913044388b407e6a5c05783d2d3731216d7862873425d28265a5fe05" gracePeriod=30
Jan 27 15:31:30 crc kubenswrapper[4772]: W0127 15:31:30.526841 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef060591_3809_4f0b_974f_0785261db9b9.slice/crio-818d39110910f066b35d697f63a51f4883012a3f77256ac3126d09653c3a60e2 WatchSource:0}: Error finding container 818d39110910f066b35d697f63a51f4883012a3f77256ac3126d09653c3a60e2: Status 404 returned error can't find the container with id 818d39110910f066b35d697f63a51f4883012a3f77256ac3126d09653c3a60e2
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.531467 4772 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Jan 27 15:31:30 crc kubenswrapper[4772]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: MYSQL_CMD="mysql -h -u root -P 3306"
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: if [ -n "barbican" ]; then
Jan 27 15:31:30 crc kubenswrapper[4772]: GRANT_DATABASE="barbican"
Jan 27 15:31:30 crc kubenswrapper[4772]: else
Jan 27 15:31:30 crc kubenswrapper[4772]: GRANT_DATABASE="*"
Jan 27 15:31:30 crc kubenswrapper[4772]: fi
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: # going for maximum compatibility here:
Jan 27 15:31:30 crc kubenswrapper[4772]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Jan 27 15:31:30 crc kubenswrapper[4772]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Jan 27 15:31:30 crc kubenswrapper[4772]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Jan 27 15:31:30 crc kubenswrapper[4772]: # support updates
Jan 27 15:31:30 crc kubenswrapper[4772]:
Jan 27 15:31:30 crc kubenswrapper[4772]: $MYSQL_CMD < logger="UnhandledError"
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.532873 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-97c3-account-create-update-bvlvs" podUID="ef060591-3809-4f0b-974f-0785261db9b9"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.675002 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.200:6080/vnc_lite.html\": dial tcp 10.217.0.200:6080: connect: connection refused"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.715344 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d7e14a-70d3-446e-8250-ca1047b5bc4b" path="/var/lib/kubelet/pods/08d7e14a-70d3-446e-8250-ca1047b5bc4b/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.716445 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edf6707-14dd-4986-8d64-0e48a31d6a39" path="/var/lib/kubelet/pods/0edf6707-14dd-4986-8d64-0e48a31d6a39/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.717344 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b68551f-119d-4d84-9c91-20e013018b7a" path="/var/lib/kubelet/pods/2b68551f-119d-4d84-9c91-20e013018b7a/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.718852 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54bbbf38-088b-4e4d-8154-569667fcf9a9" path="/var/lib/kubelet/pods/54bbbf38-088b-4e4d-8154-569667fcf9a9/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.719279 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.719399 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data podName:76fdbdb1-d48a-4cd1-8372-78887671dce8 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:34.719369878 +0000 UTC m=+1480.699978976 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data") pod "rabbitmq-cell1-server-0" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8") : configmap "rabbitmq-cell1-config-data" not found
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.719505 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7907cc16-7665-49d3-ad17-f9e6e0fc2f09" path="/var/lib/kubelet/pods/7907cc16-7665-49d3-ad17-f9e6e0fc2f09/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.720146 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be888039-f158-4d05-9f7d-6d01b2478b08" path="/var/lib/kubelet/pods/be888039-f158-4d05-9f7d-6d01b2478b08/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.721026 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112" path="/var/lib/kubelet/pods/ee9c9aa3-63e7-49ae-b3f3-f9bc0802f112/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.722379 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91bfd1b-6386-444f-95da-045fbe957f5c" path="/var/lib/kubelet/pods/f91bfd1b-6386-444f-95da-045fbe957f5c/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.723127 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe34fbf1-61c4-46a9-9954-64ed431d2cb7" path="/var/lib/kubelet/pods/fe34fbf1-61c4-46a9-9954-64ed431d2cb7/volumes"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.880978 4772 scope.go:117] "RemoveContainer" containerID="fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed"
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.881434 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed\": container with ID starting with fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed not found: ID does not exist" containerID="fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.881480 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed"} err="failed to get container status \"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed\": rpc error: code = NotFound desc = could not find container \"fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed\": container with ID starting with fc9316436d0d6826797760ceeb255662e4eca1649200864d00e921be6f2e6eed not found: ID does not exist"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.881507 4772 scope.go:117] "RemoveContainer" containerID="abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"
Jan 27 15:31:30 crc kubenswrapper[4772]: E0127 15:31:30.881860 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593\": container with ID starting with abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593 not found: ID does not exist" containerID="abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.881884 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593"} err="failed to get container status \"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593\": rpc error: code = NotFound desc = could not find container \"abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593\": container with ID starting with abdb27873ea97363386820f9e29ffa55e1d51fea483628db2a98ad8d2f8fc593 not found: ID does not exist"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.881897 4772 scope.go:117] "RemoveContainer" containerID="0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.887050 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-556764fb84-r628x"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.904755 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-gxjzh"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.914591 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-gxjzh"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.926491 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.930128 4772 generic.go:334] "Generic (PLEG): container finished" podID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerID="3cdc8204c9c28616053d96ae2843e1dddf8646f9b546be00bd78d90869086025" exitCode=1
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.930207 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qmppl" event={"ID":"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6","Type":"ContainerDied","Data":"3cdc8204c9c28616053d96ae2843e1dddf8646f9b546be00bd78d90869086025"}
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.930316 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6748df9c8c-zk7zp"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.931335 4772 scope.go:117] "RemoveContainer" containerID="3cdc8204c9c28616053d96ae2843e1dddf8646f9b546be00bd78d90869086025"
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.934925 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.941448 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-bvlvs" event={"ID":"ef060591-3809-4f0b-974f-0785261db9b9","Type":"ContainerStarted","Data":"818d39110910f066b35d697f63a51f4883012a3f77256ac3126d09653c3a60e2"}
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.958901 4772 generic.go:334] "Generic (PLEG): container finished" podID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerID="2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69" exitCode=0
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.958991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerDied","Data":"2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69"}
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.987105 4772 generic.go:334] "Generic (PLEG): container finished" podID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerID="a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546" exitCode=0
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.987192 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerDied","Data":"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546"}
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.987221 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6748df9c8c-zk7zp" event={"ID":"710edaa6-ba83-4b1f-a49a-769ca1911c9b","Type":"ContainerDied","Data":"c83991847bf683630e70d44722d44695d9152a02d09f0a3d6fe39436ebbf262d"}
Jan 27 15:31:30 crc kubenswrapper[4772]: I0127 15:31:30.987287 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6748df9c8c-zk7zp"
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.004871 4772 generic.go:334] "Generic (PLEG): container finished" podID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerID="47a1d8c4913044388b407e6a5c05783d2d3731216d7862873425d28265a5fe05" exitCode=0
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.004916 4772 generic.go:334] "Generic (PLEG): container finished" podID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerID="a476d84a3741734575b073569a645d9d973c5cdbb39812aa454a7257859db22b" exitCode=0
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.004915 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerDied","Data":"47a1d8c4913044388b407e6a5c05783d2d3731216d7862873425d28265a5fe05"}
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.005206 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerDied","Data":"a476d84a3741734575b073569a645d9d973c5cdbb39812aa454a7257859db22b"}
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.012641 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.016336 4772 scope.go:117] "RemoveContainer" containerID="71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500"
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.029131 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data\") pod \"4ce27714-673f-47de-acc3-b6902b534bdd\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.029934 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle\") pod \"4ce27714-673f-47de-acc3-b6902b534bdd\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030103 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs\") pod \"4ce27714-673f-47de-acc3-b6902b534bdd\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030241 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmmsl\" (UniqueName: \"kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl\") pod \"4ce27714-673f-47de-acc3-b6902b534bdd\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030525 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvs49\" (UniqueName: \"kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49\") pod \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030670 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom\") pod \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030831 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom\") pod \"4ce27714-673f-47de-acc3-b6902b534bdd\" (UID: \"4ce27714-673f-47de-acc3-b6902b534bdd\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.030959 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data\") pod \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.031566 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle\") pod \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.032017 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs\") pod \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\" (UID: \"710edaa6-ba83-4b1f-a49a-769ca1911c9b\") "
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.033591 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs" (OuterVolumeSpecName: "logs") pod "4ce27714-673f-47de-acc3-b6902b534bdd" (UID: "4ce27714-673f-47de-acc3-b6902b534bdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.034703 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs" (OuterVolumeSpecName: "logs") pod "710edaa6-ba83-4b1f-a49a-769ca1911c9b" (UID: "710edaa6-ba83-4b1f-a49a-769ca1911c9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.038076 4772 generic.go:334] "Generic (PLEG): container finished" podID="4ce27714-673f-47de-acc3-b6902b534bdd" containerID="aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc" exitCode=0
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.038195 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerDied","Data":"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc"}
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.038542 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "710edaa6-ba83-4b1f-a49a-769ca1911c9b" (UID: "710edaa6-ba83-4b1f-a49a-769ca1911c9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.038727 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-556764fb84-r628x"
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.040306 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-556764fb84-r628x" event={"ID":"4ce27714-673f-47de-acc3-b6902b534bdd","Type":"ContainerDied","Data":"51e9e5e71be46820f9c3d1564ff14b9e6df8988ed057a1326779c07f7fee3331"}
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.040446 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49" (OuterVolumeSpecName: "kube-api-access-nvs49") pod "710edaa6-ba83-4b1f-a49a-769ca1911c9b" (UID: "710edaa6-ba83-4b1f-a49a-769ca1911c9b"). InnerVolumeSpecName "kube-api-access-nvs49". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.041347 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl" (OuterVolumeSpecName: "kube-api-access-pmmsl") pod "4ce27714-673f-47de-acc3-b6902b534bdd" (UID: "4ce27714-673f-47de-acc3-b6902b534bdd"). InnerVolumeSpecName "kube-api-access-pmmsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.045304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4ce27714-673f-47de-acc3-b6902b534bdd" (UID: "4ce27714-673f-47de-acc3-b6902b534bdd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.058976 4772 generic.go:334] "Generic (PLEG): container finished" podID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerID="7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce" exitCode=0
Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.059023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e69643a-e8c2-4057-a993-d5506ceeec1b","Type":"ContainerDied","Data":"7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce"}
Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.088759 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e69643a_e8c2_4057_a993_d5506ceeec1b.slice/crio-conmon-7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e69643a_e8c2_4057_a993_d5506ceeec1b.slice/crio-7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4270ab9b_f4a9_4d48_9cc2_f25152ee5fb2.slice/crio-2d9f9f123f138892540800ef23f48dae96e200e8a0b42b345d3f87addf089f7e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda490a71b_c33d_4c94_9592_f97d1d315e81.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edf6707_14dd_4986_8d64_0e48a31d6a39.slice/crio-5c52f5cf3b82427db2a187bbd0708a64e4f14f826b96324500c229ad2e72a4cf\": RecentStats: unable to find data in memory cache],
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf619242_7348_4de4_a37e_8ebdc4ca54d7.slice/crio-conmon-2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda490a71b_c33d_4c94_9592_f97d1d315e81.slice/crio-b17e4736d7a1350c0c68f20fb3327f8519a43cb3b16a163a3b8e79d710328aca\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edf6707_14dd_4986_8d64_0e48a31d6a39.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4270ab9b_f4a9_4d48_9cc2_f25152ee5fb2.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfddd5e59_3124_4a05_aafd_92d6aea05f7e.slice/crio-c5c94af58b0cd6c043cac9ed46da0616cb74fd66aa5279858fb42cf515ba3aa1\": RecentStats: unable to find data in memory cache]" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.112775 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.120384 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.129443 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134221 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134405 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134541 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhlp\" (UniqueName: \"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134651 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134801 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.134940 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.135079 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: 
\"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.135222 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle\") pod \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\" (UID: \"cf619242-7348-4de4-a37e-8ebdc4ca54d7\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.135884 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.135987 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/710edaa6-ba83-4b1f-a49a-769ca1911c9b-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.136067 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ce27714-673f-47de-acc3-b6902b534bdd-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.136126 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmmsl\" (UniqueName: \"kubernetes.io/projected/4ce27714-673f-47de-acc3-b6902b534bdd-kube-api-access-pmmsl\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.136201 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvs49\" (UniqueName: \"kubernetes.io/projected/710edaa6-ba83-4b1f-a49a-769ca1911c9b-kube-api-access-nvs49\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.136256 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data-custom\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.136995 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.137414 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.137587 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.140305 4772 scope.go:117] "RemoveContainer" containerID="0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.140888 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86\": container with ID starting with 0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86 not found: ID does not exist" containerID="0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.140929 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86"} err="failed to get container status \"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86\": rpc error: code = NotFound desc = could not find container \"0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86\": container with ID starting with 0ee5661567fa3ca13869262f2ac472811c7a59976cae6fbfe300747e324b4e86 not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.140958 4772 scope.go:117] "RemoveContainer" containerID="71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.141034 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.141193 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.141536 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500\": container with ID starting with 71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500 not found: ID does not exist" containerID="71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.141578 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500"} err="failed to get container status \"71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500\": rpc error: code = NotFound desc = could not find container \"71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500\": container with ID starting with 71013e440a971fe3ef401d90a82408249df3d1180b65da0eb4683442d6023500 not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.141607 4772 scope.go:117] "RemoveContainer" containerID="9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.144196 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-2hd4f"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.154711 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.155362 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp" (OuterVolumeSpecName: "kube-api-access-kvhlp") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "kube-api-access-kvhlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.168053 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data" (OuterVolumeSpecName: "config-data") pod "4ce27714-673f-47de-acc3-b6902b534bdd" (UID: "4ce27714-673f-47de-acc3-b6902b534bdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.170230 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-vqpfg"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.192429 4772 scope.go:117] "RemoveContainer" containerID="abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.194643 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ce27714-673f-47de-acc3-b6902b534bdd" (UID: "4ce27714-673f-47de-acc3-b6902b534bdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.221592 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.237807 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ttx\" (UniqueName: \"kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx\") pod \"5e69643a-e8c2-4057-a993-d5506ceeec1b\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.237923 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs\") pod \"5e69643a-e8c2-4057-a993-d5506ceeec1b\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.238039 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data\") pod \"5e69643a-e8c2-4057-a993-d5506ceeec1b\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.238094 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs\") pod \"5e69643a-e8c2-4057-a993-d5506ceeec1b\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.238198 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle\") pod \"5e69643a-e8c2-4057-a993-d5506ceeec1b\" (UID: \"5e69643a-e8c2-4057-a993-d5506ceeec1b\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239373 4772 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239402 4772 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239438 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239459 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhlp\" (UniqueName: \"kubernetes.io/projected/cf619242-7348-4de4-a37e-8ebdc4ca54d7-kube-api-access-kvhlp\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239475 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239490 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239501 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cf619242-7348-4de4-a37e-8ebdc4ca54d7-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.239513 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4ce27714-673f-47de-acc3-b6902b534bdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.259042 4772 scope.go:117] "RemoveContainer" containerID="9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.261453 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4\": container with ID starting with 9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4 not found: ID does not exist" containerID="9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.261504 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4"} err="failed to get container status \"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4\": rpc error: code = NotFound desc = could not find container \"9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4\": container with ID starting with 9ab2ac6bce7a8071ec2b4cecbc76933f6c63344bca73557900280dd89a9b1ef4 not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.261532 4772 scope.go:117] "RemoveContainer" containerID="abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.262109 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c\": container with ID starting with abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c not found: ID does not exist" containerID="abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c" Jan 
27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.262218 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c"} err="failed to get container status \"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c\": rpc error: code = NotFound desc = could not find container \"abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c\": container with ID starting with abb84f069b7ba6556a04c96fbef42abc5bac570c75f402b32a5f9f20ac96046c not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.262312 4772 scope.go:117] "RemoveContainer" containerID="afc8ab10fea0840566de64c53bc97d22454ee25e120ead660e5999b0da009daf" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.274895 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx" (OuterVolumeSpecName: "kube-api-access-85ttx") pod "5e69643a-e8c2-4057-a993-d5506ceeec1b" (UID: "5e69643a-e8c2-4057-a993-d5506ceeec1b"). InnerVolumeSpecName "kube-api-access-85ttx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.290129 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "710edaa6-ba83-4b1f-a49a-769ca1911c9b" (UID: "710edaa6-ba83-4b1f-a49a-769ca1911c9b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.294293 4772 scope.go:117] "RemoveContainer" containerID="b93ad84c922746d427d3e2a2deb04a875a239fcafbecb5146ae05b1b11e36a09" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.318711 4772 scope.go:117] "RemoveContainer" containerID="0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.332206 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.343332 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.343366 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85ttx\" (UniqueName: \"kubernetes.io/projected/5e69643a-e8c2-4057-a993-d5506ceeec1b-kube-api-access-85ttx\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.385328 4772 scope.go:117] "RemoveContainer" containerID="0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.386732 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa\": container with ID starting with 0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa not found: ID does not exist" containerID="0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.387007 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa"} err="failed to get container status \"0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa\": rpc error: code = NotFound desc = could not find container \"0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa\": container with ID starting with 0c37dc673e475cc4ca1e8b831b0543b26650ceedc799dace964e07fb4c7c7ffa not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.387040 4772 scope.go:117] "RemoveContainer" containerID="a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.407473 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.410224 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.412245 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-556764fb84-r628x"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.412332 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.421262 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data" (OuterVolumeSpecName: "config-data") pod "710edaa6-ba83-4b1f-a49a-769ca1911c9b" (UID: "710edaa6-ba83-4b1f-a49a-769ca1911c9b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.428638 4772 scope.go:117] "RemoveContainer" containerID="b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.439259 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444010 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444157 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444257 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444303 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444342 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444386 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444470 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfzf2\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444504 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs\") pod \"c16a29a0-7238-4a5e-b892-8f5195a1f486\" (UID: \"c16a29a0-7238-4a5e-b892-8f5195a1f486\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444928 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444948 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/710edaa6-ba83-4b1f-a49a-769ca1911c9b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.444960 4772 reconciler_common.go:293] "Volume detached for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.446497 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "cf619242-7348-4de4-a37e-8ebdc4ca54d7" (UID: "cf619242-7348-4de4-a37e-8ebdc4ca54d7"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.446944 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.447507 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.448121 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data" (OuterVolumeSpecName: "config-data") pod "5e69643a-e8c2-4057-a993-d5506ceeec1b" (UID: "5e69643a-e8c2-4057-a993-d5506ceeec1b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.462695 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e69643a-e8c2-4057-a993-d5506ceeec1b" (UID: "5e69643a-e8c2-4057-a993-d5506ceeec1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.468665 4772 scope.go:117] "RemoveContainer" containerID="a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.468835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2" (OuterVolumeSpecName: "kube-api-access-pfzf2") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "kube-api-access-pfzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.469117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.469319 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546\": container with ID starting with a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546 not found: ID does not exist" containerID="a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.469354 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546"} err="failed to get container status \"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546\": rpc error: code = NotFound desc = could not find container \"a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546\": container with ID starting with a063d80b4cd5f0199157f5e139c54f744514f0203001f47e4bce93805443a546 not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.469373 4772 scope.go:117] "RemoveContainer" containerID="b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.469871 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a\": container with ID starting with b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a not found: ID does not exist" containerID="b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.469892 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a"} 
err="failed to get container status \"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a\": rpc error: code = NotFound desc = could not find container \"b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a\": container with ID starting with b1ca77abcb5dfa41040a6625bbdc220ae80143a0714b3ff9a856057794a6d02a not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.469905 4772 scope.go:117] "RemoveContainer" containerID="aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.524816 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.532697 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "5e69643a-e8c2-4057-a993-d5506ceeec1b" (UID: "5e69643a-e8c2-4057-a993-d5506ceeec1b"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.535939 4772 scope.go:117] "RemoveContainer" containerID="f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.545796 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j94f\" (UniqueName: \"kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f\") pod \"ef060591-3809-4f0b-974f-0785261db9b9\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.548396 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "5e69643a-e8c2-4057-a993-d5506ceeec1b" (UID: "5e69643a-e8c2-4057-a993-d5506ceeec1b"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.549363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef060591-3809-4f0b-974f-0785261db9b9" (UID: "ef060591-3809-4f0b-974f-0785261db9b9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.550013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts\") pod \"ef060591-3809-4f0b-974f-0785261db9b9\" (UID: \"ef060591-3809-4f0b-974f-0785261db9b9\") " Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.550038 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f" (OuterVolumeSpecName: "kube-api-access-2j94f") pod "ef060591-3809-4f0b-974f-0785261db9b9" (UID: "ef060591-3809-4f0b-974f-0785261db9b9"). InnerVolumeSpecName "kube-api-access-2j94f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.559906 4772 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf619242-7348-4de4-a37e-8ebdc4ca54d7-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560021 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560044 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfzf2\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-kube-api-access-pfzf2\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560061 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: 
I0127 15:31:31.560074 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j94f\" (UniqueName: \"kubernetes.io/projected/ef060591-3809-4f0b-974f-0785261db9b9-kube-api-access-2j94f\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560085 4772 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c16a29a0-7238-4a5e-b892-8f5195a1f486-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560098 4772 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560110 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560120 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c16a29a0-7238-4a5e-b892-8f5195a1f486-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560135 4772 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e69643a-e8c2-4057-a993-d5506ceeec1b-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560146 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.560158 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ef060591-3809-4f0b-974f-0785261db9b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.560262 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.560327 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data podName:508c3d5b-212a-46da-9a55-de3f35d7019b nodeName:}" failed. No retries permitted until 2026-01-27 15:31:35.56030754 +0000 UTC m=+1481.540916638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data") pod "rabbitmq-server-0" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b") : configmap "rabbitmq-config-data" not found Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.564369 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data" (OuterVolumeSpecName: "config-data") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.580848 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.583726 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c16a29a0-7238-4a5e-b892-8f5195a1f486" (UID: "c16a29a0-7238-4a5e-b892-8f5195a1f486"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.668000 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.668027 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.668036 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c16a29a0-7238-4a5e-b892-8f5195a1f486-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.738266 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-597699949b-q6msx" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.152:8778/\": read tcp 10.217.0.2:56924->10.217.0.152:8778: read: connection reset by peer" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.738296 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-597699949b-q6msx" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.152:8778/\": read tcp 
10.217.0.2:56916->10.217.0.152:8778: read: connection reset by peer" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.779009 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.783464 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6748df9c8c-zk7zp"] Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.786484 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 is running failed: container process not found" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.786685 4772 scope.go:117] "RemoveContainer" containerID="aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.789362 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc\": container with ID starting with aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc not found: ID does not exist" containerID="aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.789405 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc"} err="failed to get container status \"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc\": rpc error: code = NotFound desc = could not find container \"aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc\": container with ID 
starting with aa76ea75f91196a6ccffd5d7e7d149b5efe900bfae2e86e19fa1ec88171321cc not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.789431 4772 scope.go:117] "RemoveContainer" containerID="f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.789517 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 is running failed: container process not found" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.789851 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f\": container with ID starting with f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f not found: ID does not exist" containerID="f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.789869 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f"} err="failed to get container status \"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f\": rpc error: code = NotFound desc = could not find container \"f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f\": container with ID starting with f068099f2f85afe1f1db1c1b4191de3b3198e413724471d516ae5586de30eb8f not found: ID does not exist" Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.789883 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 is running failed: container process not found" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:31 crc kubenswrapper[4772]: E0127 15:31:31.789905 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="b20b9215-5398-4100-bac4-763daa5ed222" containerName="nova-cell0-conductor-conductor" Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.901026 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.901383 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-central-agent" containerID="cri-o://a4293d3cbd138216987430f5dab62fa26e55c56743eee0b42dd4fc7797a52afd" gracePeriod=30 Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.901864 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="proxy-httpd" containerID="cri-o://0447c2ea1d147e4cee27fce146e4edc38d746774dc492452f5da3c48df7973bb" gracePeriod=30 Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.901995 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="sg-core" containerID="cri-o://7b0085db2ce3021657d7773e88196b66b6759beeca3bff2b51fc3fdf5d6b4bd2" gracePeriod=30 Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.902012 4772 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-notification-agent" containerID="cri-o://81bb10c06283521cef14702be02bc4e89a7f82e4ae6c7d56b76d0d05f92797d0" gracePeriod=30 Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.917880 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 15:31:31 crc kubenswrapper[4772]: I0127 15:31:31.918076 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" containerName="kube-state-metrics" containerID="cri-o://670d5287e2a9882bc2137122191964eb76c57b36df9c904f50db621c1141ab98" gracePeriod=30 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.074722 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4c0e-account-create-update-w9dkg"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.104215 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-647c88bb6f-wzf82" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9696/\": dial tcp 10.217.0.167:9696: connect: connection refused" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.115484 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.116414 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.127411 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.131263 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.131332 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.155484 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4c0e-account-create-update-w9dkg"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.163875 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.164148 4772 generic.go:334] "Generic (PLEG): container finished" podID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" containerID="670d5287e2a9882bc2137122191964eb76c57b36df9c904f50db621c1141ab98" exitCode=2 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.164216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"21f54218-5889-4ae9-a7a1-7ed4895ad63c","Type":"ContainerDied","Data":"670d5287e2a9882bc2137122191964eb76c57b36df9c904f50db621c1141ab98"} Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.182942 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.182998 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.218837 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:33664->10.217.0.207:8775: read: connection reset by peer" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.218844 4772 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.207:8775/\": read tcp 10.217.0.2:33648->10.217.0.207:8775: read: connection reset by peer" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.219254 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.219437 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" containerName="memcached" containerID="cri-o://faf687181014b14838de86572705cbe5952bdabca1b3fad7e35afc3ce6238c0f" gracePeriod=30 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.231566 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4c0e-account-create-update-wlbfm"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.232111 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a490a71b-c33d-4c94-9592-f97d1d315e81" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.232197 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a490a71b-c33d-4c94-9592-f97d1d315e81" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.232392 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.232459 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.232523 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-server" Jan 27 
15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.232572 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-server" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.232649 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.232723 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.232797 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.232865 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.233003 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="ovsdbserver-nb" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.233070 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="ovsdbserver-nb" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.233146 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-httpd" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.233212 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-httpd" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.233593 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" 
containerName="barbican-keystone-listener" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.240509 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.240714 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="dnsmasq-dns" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.240800 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="dnsmasq-dns" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.240875 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker-log" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.240927 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker-log" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241014 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="init" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241062 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="init" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241114 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241160 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241237 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" 
containerName="galera" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241286 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerName="galera" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241349 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerName="mysql-bootstrap" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241395 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerName="mysql-bootstrap" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241447 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener-log" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241493 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener-log" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241553 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241604 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.241655 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="ovsdbserver-sb" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.241702 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="ovsdbserver-sb" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.234387 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerID="7b0085db2ce3021657d7773e88196b66b6759beeca3bff2b51fc3fdf5d6b4bd2" exitCode=2 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.242057 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="ovsdbserver-sb" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243382 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243460 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" containerName="dnsmasq-dns" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243511 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker-log" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243559 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="ovsdbserver-nb" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243607 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243660 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener-log" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243720 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-httpd" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243768 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" containerName="proxy-server" Jan 
27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243818 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" containerName="galera" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243871 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243918 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" containerName="barbican-keystone-listener" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.243965 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" containerName="barbican-worker" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.244013 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a490a71b-c33d-4c94-9592-f97d1d315e81" containerName="openstack-network-exporter" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.244072 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="220011f2-8778-4a14-82d4-33a07bd33379" containerName="ovn-controller" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.247477 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4c0e-account-create-update-wlbfm"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.247534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerDied","Data":"7b0085db2ce3021657d7773e88196b66b6759beeca3bff2b51fc3fdf5d6b4bd2"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.247578 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.249651 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.250791 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.250876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5e69643a-e8c2-4057-a993-d5506ceeec1b","Type":"ContainerDied","Data":"6444b6c25763e568fb1ce306052e7e5dc898559abb4fc82fb393de7a9f4a2b66"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.250951 4772 scope.go:117] "RemoveContainer" containerID="7a1429ee13edd2169e8a683ea45dcb648c58812d36d48307ba37a8f39d0a67ce" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.253978 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-97c3-account-create-update-bvlvs" event={"ID":"ef060591-3809-4f0b-974f-0785261db9b9","Type":"ContainerDied","Data":"818d39110910f066b35d697f63a51f4883012a3f77256ac3126d09653c3a60e2"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.254030 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-97c3-account-create-update-bvlvs" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.264328 4772 generic.go:334] "Generic (PLEG): container finished" podID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerID="ad26ca4835a223df0b0aa3065e02d9e54b67030d2b6d0436f1f1a0dd7bf06415" exitCode=0 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.264444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerDied","Data":"ad26ca4835a223df0b0aa3065e02d9e54b67030d2b6d0436f1f1a0dd7bf06415"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.274713 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: connect: connection refused" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.280322 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fl4nt"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.286785 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" event={"ID":"c16a29a0-7238-4a5e-b892-8f5195a1f486","Type":"ContainerDied","Data":"92e9170b2797b87fe5816f61d1944a7f0cca88f2e0e21f7420f27a5ed25d4005"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.286900 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d86f6cfbc-cwfmc" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.334205 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kvb25"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.341287 4772 scope.go:117] "RemoveContainer" containerID="47a1d8c4913044388b407e6a5c05783d2d3731216d7862873425d28265a5fe05" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.342586 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-659485ddbb-5bnzg" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:44230->10.217.0.162:9311: read: connection reset by peer" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.342859 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-659485ddbb-5bnzg" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:44232->10.217.0.162:9311: read: connection reset by peer" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.349872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4llv\" (UniqueName: \"kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.350105 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: 
\"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.351202 4772 generic.go:334] "Generic (PLEG): container finished" podID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerID="9005da10eaad68221a5ab75b0d10da02a46a7bd38d46bece0339dd56d8e2fc51" exitCode=1 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.351290 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qmppl" event={"ID":"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6","Type":"ContainerDied","Data":"9005da10eaad68221a5ab75b0d10da02a46a7bd38d46bece0339dd56d8e2fc51"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.351789 4772 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-qmppl" secret="" err="secret \"galera-openstack-dockercfg-4dfv4\" not found" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.351828 4772 scope.go:117] "RemoveContainer" containerID="9005da10eaad68221a5ab75b0d10da02a46a7bd38d46bece0339dd56d8e2fc51" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.352097 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-qmppl_openstack(4cbf7469-816d-4e54-a7ad-b5b76d0d59d6)\"" pod="openstack/root-account-create-update-qmppl" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.356853 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kvb25"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.392500 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fl4nt"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.393592 4772 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.393789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cf619242-7348-4de4-a37e-8ebdc4ca54d7","Type":"ContainerDied","Data":"9858c0fc9167c8fdb9fe56212a74207375b7ea71449891249cf75618c47eff4b"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.405357 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.451753 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4llv\" (UniqueName: \"kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.463729 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.458374 4772 projected.go:194] Error preparing data for projected volume kube-api-access-z4llv for pod openstack/keystone-4c0e-account-create-update-wlbfm: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.464113 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:31:32.964098289 +0000 UTC m=+1478.944707387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z4llv" (UniqueName: "kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.461445 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.465116 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-677fb7d6fc-djjsx" podUID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" containerName="keystone-api" containerID="cri-o://468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc" gracePeriod=30 Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.464326 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.467645 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:32.967631071 +0000 UTC m=+1478.948240169 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : configmap "openstack-scripts" not found Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.464016 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.467769 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts podName:4cbf7469-816d-4e54-a7ad-b5b76d0d59d6 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:32.967762404 +0000 UTC m=+1478.948371502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts") pod "root-account-create-update-qmppl" (UID: "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6") : configmap "openstack-scripts" not found Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.469593 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-nmvpf"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.505825 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4c0e-account-create-update-wlbfm"] Jan 27 15:31:32 crc kubenswrapper[4772]: E0127 15:31:32.506453 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-z4llv operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-4c0e-account-create-update-wlbfm" podUID="05445a27-d839-4a60-8338-5ee5f2c3f9d7" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.516946 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-nmvpf"] Jan 27 
15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.589872 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.590329 4772 generic.go:334] "Generic (PLEG): container finished" podID="be772158-a71c-448d-8972-014f0d3a9ab8" containerID="c47159ab0aee5087f5a44073988d2ad8d6aaaa0e47ba7702dc2a03eab229b375" exitCode=0 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.590386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerDied","Data":"c47159ab0aee5087f5a44073988d2ad8d6aaaa0e47ba7702dc2a03eab229b375"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.601234 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.626597 4772 scope.go:117] "RemoveContainer" containerID="a476d84a3741734575b073569a645d9d973c5cdbb39812aa454a7257859db22b" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.635199 4772 generic.go:334] "Generic (PLEG): container finished" podID="9a02b617-28a7-4262-a110-f1c71763ad19" containerID="3114715e24bc63a93ce31ec7ec2cc2fdeaad0a6c7647de22f23d06ac45e3d864" exitCode=0 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.635297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerDied","Data":"3114715e24bc63a93ce31ec7ec2cc2fdeaad0a6c7647de22f23d06ac45e3d864"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.669215 4772 generic.go:334] "Generic (PLEG): container finished" podID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerID="6481b50eed7f8997cc197c4b50a1b5d1b9aa395b3745aa30ff2d6ee451d23215" exitCode=0 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.670278 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle\") pod \"b20b9215-5398-4100-bac4-763daa5ed222\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.674704 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data\") pod \"b20b9215-5398-4100-bac4-763daa5ed222\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.675808 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kbc8\" (UniqueName: \"kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8\") pod \"b20b9215-5398-4100-bac4-763daa5ed222\" (UID: \"b20b9215-5398-4100-bac4-763daa5ed222\") " Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.683565 4772 generic.go:334] "Generic (PLEG): container finished" podID="b20b9215-5398-4100-bac4-763daa5ed222" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" exitCode=0 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.683653 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.687550 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220011f2-8778-4a14-82d4-33a07bd33379" path="/var/lib/kubelet/pods/220011f2-8778-4a14-82d4-33a07bd33379/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.688943 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2" path="/var/lib/kubelet/pods/4270ab9b-f4a9-4d48-9cc2-f25152ee5fb2/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.689532 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce27714-673f-47de-acc3-b6902b534bdd" path="/var/lib/kubelet/pods/4ce27714-673f-47de-acc3-b6902b534bdd/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.690687 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a1d71f-3b00-42c0-92c4-a29fb3d4518c" path="/var/lib/kubelet/pods/57a1d71f-3b00-42c0-92c4-a29fb3d4518c/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.691242 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710edaa6-ba83-4b1f-a49a-769ca1911c9b" path="/var/lib/kubelet/pods/710edaa6-ba83-4b1f-a49a-769ca1911c9b/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.691781 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8322baad-60c1-4d0b-96e3-51038f2e447a" path="/var/lib/kubelet/pods/8322baad-60c1-4d0b-96e3-51038f2e447a/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.692901 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a490a71b-c33d-4c94-9592-f97d1d315e81" path="/var/lib/kubelet/pods/a490a71b-c33d-4c94-9592-f97d1d315e81/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.700057 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbad3a30-e11d-4ae8-9c42-e06b6382c6de" 
path="/var/lib/kubelet/pods/bbad3a30-e11d-4ae8-9c42-e06b6382c6de/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.700728 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc34a3a4-ad0b-4154-82c9-728227b19732" path="/var/lib/kubelet/pods/dc34a3a4-ad0b-4154-82c9-728227b19732/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.701284 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef900211-2a44-498c-adb6-fec1abcba5ec" path="/var/lib/kubelet/pods/ef900211-2a44-498c-adb6-fec1abcba5ec/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.702307 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fddd5e59-3124-4a05-aafd-92d6aea05f7e" path="/var/lib/kubelet/pods/fddd5e59-3124-4a05-aafd-92d6aea05f7e/volumes" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.705396 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8" (OuterVolumeSpecName: "kube-api-access-8kbc8") pod "b20b9215-5398-4100-bac4-763daa5ed222" (UID: "b20b9215-5398-4100-bac4-763daa5ed222"). InnerVolumeSpecName "kube-api-access-8kbc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.716824 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b20b9215-5398-4100-bac4-763daa5ed222" (UID: "b20b9215-5398-4100-bac4-763daa5ed222"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.734219 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data" (OuterVolumeSpecName: "config-data") pod "b20b9215-5398-4100-bac4-763daa5ed222" (UID: "b20b9215-5398-4100-bac4-763daa5ed222"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.755459 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="galera" containerID="cri-o://21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" gracePeriod=30 Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.802821 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kbc8\" (UniqueName: \"kubernetes.io/projected/b20b9215-5398-4100-bac4-763daa5ed222-kube-api-access-8kbc8\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.802853 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.802866 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b20b9215-5398-4100-bac4-763daa5ed222-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.805938 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.805967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerDied","Data":"6481b50eed7f8997cc197c4b50a1b5d1b9aa395b3745aa30ff2d6ee451d23215"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.805987 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b20b9215-5398-4100-bac4-763daa5ed222","Type":"ContainerDied","Data":"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83"} Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806002 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806019 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806029 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-97c3-account-create-update-bvlvs"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806043 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806056 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-d86f6cfbc-cwfmc"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806070 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.806080 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.817978 4772 scope.go:117] "RemoveContainer" containerID="3cdc8204c9c28616053d96ae2843e1dddf8646f9b546be00bd78d90869086025" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.916090 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" 
containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Jan 27 15:31:32 crc kubenswrapper[4772]: I0127 15:31:32.966828 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.004823 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.005576 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4llv\" (UniqueName: \"kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.005680 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.005864 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.005928 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts podName:4cbf7469-816d-4e54-a7ad-b5b76d0d59d6 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:34.005910956 +0000 UTC m=+1479.986520054 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts") pod "root-account-create-update-qmppl" (UID: "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6") : configmap "openstack-scripts" not found Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.008329 4772 projected.go:194] Error preparing data for projected volume kube-api-access-z4llv for pod openstack/keystone-4c0e-account-create-update-wlbfm: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.008465 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:34.00845163 +0000 UTC m=+1479.989060718 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z4llv" (UniqueName: "kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.008907 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.008976 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:34.008958374 +0000 UTC m=+1479.989567502 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : configmap "openstack-scripts" not found Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.022572 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.029011 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.107501 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nblgh\" (UniqueName: \"kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh\") pod \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.108325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle\") pod \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.108487 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmnhr\" (UniqueName: \"kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.108964 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" 
(UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109232 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109302 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config\") pod \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109378 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109503 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs\") pod \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\" (UID: \"21f54218-5889-4ae9-a7a1-7ed4895ad63c\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.109593 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs\") pod \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\" (UID: \"93c8f9a4-c6ef-42b8-8543-ff8b5347977e\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.112408 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr" (OuterVolumeSpecName: "kube-api-access-pmnhr") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "kube-api-access-pmnhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.114705 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh" (OuterVolumeSpecName: "kube-api-access-nblgh") pod "21f54218-5889-4ae9-a7a1-7ed4895ad63c" (UID: "21f54218-5889-4ae9-a7a1-7ed4895ad63c"). InnerVolumeSpecName "kube-api-access-nblgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.115538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs" (OuterVolumeSpecName: "logs") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.183319 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data" (OuterVolumeSpecName: "config-data") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.183537 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "21f54218-5889-4ae9-a7a1-7ed4895ad63c" (UID: "21f54218-5889-4ae9-a7a1-7ed4895ad63c"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.187745 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21f54218-5889-4ae9-a7a1-7ed4895ad63c" (UID: "21f54218-5889-4ae9-a7a1-7ed4895ad63c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.208950 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213094 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nblgh\" (UniqueName: \"kubernetes.io/projected/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-api-access-nblgh\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213268 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213393 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmnhr\" (UniqueName: \"kubernetes.io/projected/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-kube-api-access-pmnhr\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213471 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213546 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213602 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.213654 4772 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc 
kubenswrapper[4772]: I0127 15:31:33.233367 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.241528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.245117 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "93c8f9a4-c6ef-42b8-8543-ff8b5347977e" (UID: "93c8f9a4-c6ef-42b8-8543-ff8b5347977e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.247263 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.265179 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-597699949b-q6msx" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.270675 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-659485ddbb-5bnzg" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.287068 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.289609 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "21f54218-5889-4ae9-a7a1-7ed4895ad63c" (UID: "21f54218-5889-4ae9-a7a1-7ed4895ad63c"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.315465 4772 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f54218-5889-4ae9-a7a1-7ed4895ad63c-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.315494 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.315503 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/93c8f9a4-c6ef-42b8-8543-ff8b5347977e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417808 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs\") pod 
\"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417846 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417886 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417915 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.417964 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418004 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghzrk\" 
(UniqueName: \"kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418064 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418088 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418117 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418194 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 
15:31:33.418219 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418238 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418265 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418295 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418326 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: 
\"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418383 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418419 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418450 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418464 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs" (OuterVolumeSpecName: "logs") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418482 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418565 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418620 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418665 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418687 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418710 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418743 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqskw\" (UniqueName: \"kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw\") pod \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\" (UID: \"766c2a26-46ea-41b2-ba0c-2101ec9477d5\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418771 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vclj\" (UniqueName: \"kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: 
I0127 15:31:33.418812 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418837 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts\") pod \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\" (UID: \"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418864 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fknks\" (UniqueName: \"kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks\") pod \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\" (UID: \"4205dfea-7dc7-496a-9745-fc5e3d0a418a\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418882 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs\") pod \"be772158-a71c-448d-8972-014f0d3a9ab8\" (UID: \"be772158-a71c-448d-8972-014f0d3a9ab8\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle\") pod \"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.418916 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts\") pod 
\"9a02b617-28a7-4262-a110-f1c71763ad19\" (UID: \"9a02b617-28a7-4262-a110-f1c71763ad19\") " Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.433390 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/be772158-a71c-448d-8972-014f0d3a9ab8-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.438297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs" (OuterVolumeSpecName: "logs") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.442697 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.444449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk" (OuterVolumeSpecName: "kube-api-access-ghzrk") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "kube-api-access-ghzrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.467157 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.475498 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts" (OuterVolumeSpecName: "scripts") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.477549 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw" (OuterVolumeSpecName: "kube-api-access-sqskw") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "kube-api-access-sqskw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.477833 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts" (OuterVolumeSpecName: "scripts") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.477912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts" (OuterVolumeSpecName: "scripts") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.478401 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs" (OuterVolumeSpecName: "logs") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.480192 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.482486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs" (OuterVolumeSpecName: "logs") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.483155 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.484247 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.484417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj" (OuterVolumeSpecName: "kube-api-access-2vclj") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "kube-api-access-2vclj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.484486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts" (OuterVolumeSpecName: "scripts") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.484561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks" (OuterVolumeSpecName: "kube-api-access-fknks") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "kube-api-access-fknks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.485278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.485658 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.491021 4772 scope.go:117] "RemoveContainer" containerID="2e743dfaa62b788cb68a4d553d64cf9affaf8ef6e4da1308fddf4dc259167b69"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.491093 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h" (OuterVolumeSpecName: "kube-api-access-rpm5h") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "kube-api-access-rpm5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.506113 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537048 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537085 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537094 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-logs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537103 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537112 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/be772158-a71c-448d-8972-014f0d3a9ab8-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537120 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4205dfea-7dc7-496a-9745-fc5e3d0a418a-logs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537128 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537136 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a02b617-28a7-4262-a110-f1c71763ad19-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537145 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data-custom\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537161 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537182 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqskw\" (UniqueName: \"kubernetes.io/projected/766c2a26-46ea-41b2-ba0c-2101ec9477d5-kube-api-access-sqskw\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537192 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vclj\" (UniqueName: \"kubernetes.io/projected/be772158-a71c-448d-8972-014f0d3a9ab8-kube-api-access-2vclj\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537200 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537208 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537216 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fknks\" (UniqueName: \"kubernetes.io/projected/4205dfea-7dc7-496a-9745-fc5e3d0a418a-kube-api-access-fknks\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537224 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537231 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-logs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537241 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghzrk\" (UniqueName: \"kubernetes.io/projected/9a02b617-28a7-4262-a110-f1c71763ad19-kube-api-access-ghzrk\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537250 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpm5h\" (UniqueName: \"kubernetes.io/projected/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-kube-api-access-rpm5h\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.537266 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.565520 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.567384 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.584257 4772 scope.go:117] "RemoveContainer" containerID="0d7ac15f647607d8d8b9ab55f639b5ec78749485b0e54cbc048e0727ed5dbce0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.585410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs" (OuterVolumeSpecName: "logs") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.591141 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.594434 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data" (OuterVolumeSpecName: "config-data") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.606566 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.609214 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data" (OuterVolumeSpecName: "config-data") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.616159 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a02b617-28a7-4262-a110-f1c71763ad19" (UID: "9a02b617-28a7-4262-a110-f1c71763ad19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.622588 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.624199 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.639581 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.639779 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.639895 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640228 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a02b617-28a7-4262-a110-f1c71763ad19-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640327 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/766c2a26-46ea-41b2-ba0c-2101ec9477d5-logs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640431 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640496 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640561 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640626 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.640688 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.643782 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data" (OuterVolumeSpecName: "config-data") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.691105 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data" (OuterVolumeSpecName: "config-data") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.696317 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.697285 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e7e78641-77e6-4c89-b5c9-0d6f3c9a9343","Type":"ContainerDied","Data":"64eb2d8855af54c245dc9d145df3ac0064c424271a5cf4af6c9815a1aa8bc16e"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.701511 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.705912 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.706030 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be772158-a71c-448d-8972-014f0d3a9ab8" (UID: "be772158-a71c-448d-8972-014f0d3a9ab8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.706067 4772 generic.go:334] "Generic (PLEG): container finished" podID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerID="db38347574e8ea3471da74617b5c2b8fd8e23430f530dbd434f5aba2a153f9bb" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.706437 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerDied","Data":"db38347574e8ea3471da74617b5c2b8fd8e23430f530dbd434f5aba2a153f9bb"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.706829 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.708960 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711685 4772 generic.go:334] "Generic (PLEG): container finished" podID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerID="0447c2ea1d147e4cee27fce146e4edc38d746774dc492452f5da3c48df7973bb" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711728 4772 generic.go:334] "Generic (PLEG): container finished" podID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerID="81bb10c06283521cef14702be02bc4e89a7f82e4ae6c7d56b76d0d05f92797d0" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711739 4772 generic.go:334] "Generic (PLEG): container finished" podID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerID="a4293d3cbd138216987430f5dab62fa26e55c56743eee0b42dd4fc7797a52afd" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerDied","Data":"0447c2ea1d147e4cee27fce146e4edc38d746774dc492452f5da3c48df7973bb"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711880 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerDied","Data":"81bb10c06283521cef14702be02bc4e89a7f82e4ae6c7d56b76d0d05f92797d0"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.711899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerDied","Data":"a4293d3cbd138216987430f5dab62fa26e55c56743eee0b42dd4fc7797a52afd"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.714473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"be772158-a71c-448d-8972-014f0d3a9ab8","Type":"ContainerDied","Data":"2e68d940e0eebbc1216da3357187ae70827b7d508fb0a26f0e91d9593aac8852"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.714614 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.718744 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-597699949b-q6msx" event={"ID":"4205dfea-7dc7-496a-9745-fc5e3d0a418a","Type":"ContainerDied","Data":"77708c49aaa66488bf09da947ac24b469a4cd3c49071689cbd09cfa6aa9b79b5"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.726332 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-597699949b-q6msx"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.731685 4772 generic.go:334] "Generic (PLEG): container finished" podID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerID="abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.731772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerDied","Data":"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.731810 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"93c8f9a4-c6ef-42b8-8543-ff8b5347977e","Type":"ContainerDied","Data":"dfddffa6f559c177ea99d7f7fef5a8fb81a5dc7c7f2005faaf77278166e23279"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.731912 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.748981 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749015 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749027 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749038 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749047 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749057 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.749067 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be772158-a71c-448d-8972-014f0d3a9ab8-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.750753 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"21f54218-5889-4ae9-a7a1-7ed4895ad63c","Type":"ContainerDied","Data":"cf3bc864ff0528c25cfa09a147802c26a644517c099a85fc5bafd7c4da9534c3"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.750880 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.757719 4772 generic.go:334] "Generic (PLEG): container finished" podID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" containerID="faf687181014b14838de86572705cbe5952bdabca1b3fad7e35afc3ce6238c0f" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.757796 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66","Type":"ContainerDied","Data":"faf687181014b14838de86572705cbe5952bdabca1b3fad7e35afc3ce6238c0f"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.763320 4772 generic.go:334] "Generic (PLEG): container finished" podID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerID="3e806373a2604b5465de7a3913d6865c82f0689bac61f26c430950d7d4efb948" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.763385 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerDied","Data":"3e806373a2604b5465de7a3913d6865c82f0689bac61f26c430950d7d4efb948"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.767917 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.768427 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9a02b617-28a7-4262-a110-f1c71763ad19","Type":"ContainerDied","Data":"1536a68238e83bb2c89cfe9a0fce1841bc4d60d2a518fdc49dc1b005d27a6470"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.768551 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.774620 4772 generic.go:334] "Generic (PLEG): container finished" podID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerID="ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874" exitCode=0
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.774727 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerDied","Data":"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.774756 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-659485ddbb-5bnzg" event={"ID":"766c2a26-46ea-41b2-ba0c-2101ec9477d5","Type":"ContainerDied","Data":"181aa9237802812b703a88787d1d6892177f6147a0214d407241520c82b45857"}
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.774729 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-659485ddbb-5bnzg"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.779272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "766c2a26-46ea-41b2-ba0c-2101ec9477d5" (UID: "766c2a26-46ea-41b2-ba0c-2101ec9477d5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.782159 4772 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-qmppl" secret="" err="secret \"galera-openstack-dockercfg-4dfv4\" not found"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.782221 4772 scope.go:117] "RemoveContainer" containerID="9005da10eaad68221a5ab75b0d10da02a46a7bd38d46bece0339dd56d8e2fc51"
Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.782433 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-qmppl_openstack(4cbf7469-816d-4e54-a7ad-b5b76d0d59d6)\"" pod="openstack/root-account-create-update-qmppl" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.796592 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-wlbfm"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.799993 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4205dfea-7dc7-496a-9745-fc5e3d0a418a" (UID: "4205dfea-7dc7-496a-9745-fc5e3d0a418a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.828616 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data" (OuterVolumeSpecName: "config-data") pod "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" (UID: "e7e78641-77e6-4c89-b5c9-0d6f3c9a9343"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.842580 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.843761 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.847567 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Jan 27 15:31:33 crc kubenswrapper[4772]: E0127 15:31:33.847642 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="ovn-northd"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.851817 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4205dfea-7dc7-496a-9745-fc5e3d0a418a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.851853 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/766c2a26-46ea-41b2-ba0c-2101ec9477d5-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.851868 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.946217 4772 scope.go:117] "RemoveContainer" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.969188 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-wlbfm"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.983621 4772 scope.go:117] "RemoveContainer" containerID="6481b50eed7f8997cc197c4b50a1b5d1b9aa395b3745aa30ff2d6ee451d23215"
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.990681 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 15:31:33 crc kubenswrapper[4772]: I0127 15:31:33.995623 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.007641 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.013456 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.026348 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.049791 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.052405 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.057182 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4llv\" (UniqueName: \"kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm"
Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.057408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts\") pod \"keystone-4c0e-account-create-update-wlbfm\" (UID: \"05445a27-d839-4a60-8338-5ee5f2c3f9d7\") " pod="openstack/keystone-4c0e-account-create-update-wlbfm"
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.057801 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.057874 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:36.057855095 +0000 UTC m=+1482.038464193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : configmap "openstack-scripts" not found
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.058260 4772 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.058292 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts podName:4cbf7469-816d-4e54-a7ad-b5b76d0d59d6 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:36.058281808 +0000 UTC m=+1482.038890906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts") pod "root-account-create-update-qmppl" (UID: "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6") : configmap "openstack-scripts" not found
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.061570 4772 projected.go:194] Error preparing data for projected volume kube-api-access-z4llv for pod openstack/keystone-4c0e-account-create-update-wlbfm: failed to fetch token: serviceaccounts "galera-openstack" not found
Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.061637 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv podName:05445a27-d839-4a60-8338-5ee5f2c3f9d7 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:36.061617904 +0000 UTC m=+1482.042227002 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-z4llv" (UniqueName: "kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv") pod "keystone-4c0e-account-create-update-wlbfm" (UID: "05445a27-d839-4a60-8338-5ee5f2c3f9d7") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.061827 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.067874 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.082520 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.088723 4772 scope.go:117] "RemoveContainer" containerID="3454f9899adaff309b52934e71697924735c1f269fb473444cba03b5baf4e1e5" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.114869 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.116341 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-597699949b-q6msx"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.120903 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.130887 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.141106 4772 scope.go:117] "RemoveContainer" containerID="c47159ab0aee5087f5a44073988d2ad8d6aaaa0e47ba7702dc2a03eab229b375" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.152049 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 
15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158102 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-659485ddbb-5bnzg"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config\") pod \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158688 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs\") pod \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158741 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data\") pod \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b44p\" (UniqueName: \"kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p\") pod \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158836 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs\") pod \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158861 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle\") pod \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158899 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r75qc\" (UniqueName: \"kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc\") pod \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.158952 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs\") pod \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.159000 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data\") pod \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\" (UID: \"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.159033 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle\") pod \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\" (UID: \"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.159230 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs" (OuterVolumeSpecName: "logs") pod 
"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" (UID: "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.159408 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-logs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.159672 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data" (OuterVolumeSpecName: "config-data") pod "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" (UID: "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.160357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" (UID: "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.164051 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc" (OuterVolumeSpecName: "kube-api-access-r75qc") pod "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" (UID: "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66"). InnerVolumeSpecName "kube-api-access-r75qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.181471 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p" (OuterVolumeSpecName: "kube-api-access-6b44p") pod "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" (UID: "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7"). InnerVolumeSpecName "kube-api-access-6b44p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.202410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data" (OuterVolumeSpecName: "config-data") pod "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" (UID: "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.208512 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" (UID: "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.209279 4772 scope.go:117] "RemoveContainer" containerID="26cc6d1f580535edc969fb0f7d0d2e7d716fa8450f944ca1657554f90801529b" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.209659 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" (UID: "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.226271 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" (UID: "a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.239001 4772 scope.go:117] "RemoveContainer" containerID="ad26ca4835a223df0b0aa3065e02d9e54b67030d2b6d0436f1f1a0dd7bf06415" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.250346 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" (UID: "f63bf600-ff03-43a3-92b4-fe8ac68a9bb7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260573 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260612 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b44p\" (UniqueName: \"kubernetes.io/projected/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-kube-api-access-6b44p\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260626 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260639 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r75qc\" (UniqueName: \"kubernetes.io/projected/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kube-api-access-r75qc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260650 4772 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260662 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260672 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: 
I0127 15:31:34.260682 4772 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.260691 4772 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.275274 4772 scope.go:117] "RemoveContainer" containerID="f10ed54f4ea68e56be83b8d8387a9768612b5c035b1fc42928132066af5bd689" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.300675 4772 scope.go:117] "RemoveContainer" containerID="abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.352640 4772 scope.go:117] "RemoveContainer" containerID="db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.373286 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.383669 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.384989 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.385014 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.400871 4772 scope.go:117] "RemoveContainer" containerID="abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.401297 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246\": container with ID starting with abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246 not found: ID does not exist" containerID="abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.401322 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246"} err="failed to get container status \"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246\": rpc error: code = NotFound desc = could not find container \"abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246\": container with ID starting with abfb528c89657cd0985ff90de17dace11a1be4c50ae49dc95a4a7ec03d093246 not found: ID does not exist" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.401346 4772 scope.go:117] "RemoveContainer" 
containerID="db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.401583 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597\": container with ID starting with db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597 not found: ID does not exist" containerID="db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.401601 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597"} err="failed to get container status \"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597\": rpc error: code = NotFound desc = could not find container \"db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597\": container with ID starting with db0ed28f713318389578164d5ba7364ff5e0ca569d4bd32de0483eb615fe7597 not found: ID does not exist" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.401615 4772 scope.go:117] "RemoveContainer" containerID="670d5287e2a9882bc2137122191964eb76c57b36df9c904f50db621c1141ab98" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.434723 4772 scope.go:117] "RemoveContainer" containerID="3114715e24bc63a93ce31ec7ec2cc2fdeaad0a6c7647de22f23d06ac45e3d864" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.486213 4772 scope.go:117] "RemoveContainer" containerID="d767e789b4befb7b8caac693075691222c00bb6ae1189417345706dad41621f9" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.534211 4772 scope.go:117] "RemoveContainer" containerID="ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.613143 4772 scope.go:117] "RemoveContainer" 
containerID="23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.680064 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" path="/var/lib/kubelet/pods/21f54218-5889-4ae9-a7a1-7ed4895ad63c/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.680828 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" path="/var/lib/kubelet/pods/4205dfea-7dc7-496a-9745-fc5e3d0a418a/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.681543 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e69643a-e8c2-4057-a993-d5506ceeec1b" path="/var/lib/kubelet/pods/5e69643a-e8c2-4057-a993-d5506ceeec1b/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.682823 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" path="/var/lib/kubelet/pods/766c2a26-46ea-41b2-ba0c-2101ec9477d5/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.683612 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" path="/var/lib/kubelet/pods/93c8f9a4-c6ef-42b8-8543-ff8b5347977e/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.684288 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" path="/var/lib/kubelet/pods/9a02b617-28a7-4262-a110-f1c71763ad19/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.685651 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20b9215-5398-4100-bac4-763daa5ed222" path="/var/lib/kubelet/pods/b20b9215-5398-4100-bac4-763daa5ed222/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.686135 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="be772158-a71c-448d-8972-014f0d3a9ab8" path="/var/lib/kubelet/pods/be772158-a71c-448d-8972-014f0d3a9ab8/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.686765 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c16a29a0-7238-4a5e-b892-8f5195a1f486" path="/var/lib/kubelet/pods/c16a29a0-7238-4a5e-b892-8f5195a1f486/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.687890 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf619242-7348-4de4-a37e-8ebdc4ca54d7" path="/var/lib/kubelet/pods/cf619242-7348-4de4-a37e-8ebdc4ca54d7/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.688695 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" path="/var/lib/kubelet/pods/e7e78641-77e6-4c89-b5c9-0d6f3c9a9343/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.690082 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef060591-3809-4f0b-974f-0785261db9b9" path="/var/lib/kubelet/pods/ef060591-3809-4f0b-974f-0785261db9b9/volumes" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.714610 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.721420 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.730843 4772 scope.go:117] "RemoveContainer" containerID="ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.731701 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874\": container with ID starting with ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874 not found: ID does not exist" containerID="ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.731759 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874"} err="failed to get container status \"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874\": rpc error: code = NotFound desc = could not find container \"ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874\": container with ID starting with ffbe05081a83d720881627c45e3d405aaf574d1db8fa63481da9c229023c0874 not found: ID does not exist" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.731779 4772 scope.go:117] "RemoveContainer" containerID="23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.733409 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573\": container with ID starting with 23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573 not found: ID does not exist" containerID="23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 
15:31:34.733428 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573"} err="failed to get container status \"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573\": rpc error: code = NotFound desc = could not find container \"23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573\": container with ID starting with 23481794981b6875427087c492230b72f248918903ae28fa47bb73190cfa8573 not found: ID does not exist" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.733470 4772 scope.go:117] "RemoveContainer" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.733749 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83\": container with ID starting with 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 not found: ID does not exist" containerID="2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.733794 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83"} err="failed to get container status \"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83\": rpc error: code = NotFound desc = could not find container \"2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83\": container with ID starting with 2e74d40bce110215c2607c9a9b716bf9d9db61e446fe99bb897518adbdc86d83 not found: ID does not exist" Jan 27 15:31:34 crc kubenswrapper[4772]: E0127 15:31:34.774958 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:34 crc 
kubenswrapper[4772]: E0127 15:31:34.775051 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data podName:76fdbdb1-d48a-4cd1-8372-78887671dce8 nodeName:}" failed. No retries permitted until 2026-01-27 15:31:42.775031751 +0000 UTC m=+1488.755640859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data") pod "rabbitmq-cell1-server-0" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8") : configmap "rabbitmq-cell1-config-data" not found Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.832815 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"683f458e-44e9-49ea-a66b-4ac91a3f2bc1","Type":"ContainerDied","Data":"bb200c044803c6c5491d60dc192f271f4cdf0adcf18a5f0f12ab40acb77fdf72"} Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.832864 4772 scope.go:117] "RemoveContainer" containerID="112ddc6068b3694383f83c1ffece42788a7623920d1c02ff9f46202f7c8c0d7e" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.832994 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.837653 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b8101bc-2ddf-48ed-9b92-e8f9e5e71938/ovn-northd/0.log" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.837688 4772 generic.go:334] "Generic (PLEG): container finished" podID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerID="f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" exitCode=139 Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.837729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerDied","Data":"f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a"} Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.841708 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.841722 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f63bf600-ff03-43a3-92b4-fe8ac68a9bb7","Type":"ContainerDied","Data":"a940184dde4998665ff3925c8d268f050f05912d1265137578965cd151d251c3"} Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.857069 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66","Type":"ContainerDied","Data":"022c0f29ec3ec9ea31194094e372dfed87fe074f880cb471419a54885eeba246"} Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.857300 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.864930 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.865451 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aea5ee43-36e3-437d-8aca-b2faedd87c5b","Type":"ContainerDied","Data":"0d9e1d64ee2212bcbce9b483a76517d64478f416567bc79c87cd9fc874d3b4e1"} Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.865622 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4c0e-account-create-update-wlbfm" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.879593 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72jk7\" (UniqueName: \"kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.879848 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.879935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880016 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 
15:31:34.880109 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880343 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880456 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880653 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880895 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs\") pod 
\"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.880984 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle\") pod \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\" (UID: \"aea5ee43-36e3-437d-8aca-b2faedd87c5b\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.881063 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrz6x\" (UniqueName: \"kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.881133 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.881237 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle\") pod \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\" (UID: \"683f458e-44e9-49ea-a66b-4ac91a3f2bc1\") " Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.885304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.886411 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.891017 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.894859 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7" (OuterVolumeSpecName: "kube-api-access-72jk7") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "kube-api-access-72jk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.894985 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts" (OuterVolumeSpecName: "scripts") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.895292 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.895516 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x" (OuterVolumeSpecName: "kube-api-access-rrz6x") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "kube-api-access-rrz6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.897217 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts" (OuterVolumeSpecName: "scripts") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.908641 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b8101bc-2ddf-48ed-9b92-e8f9e5e71938/ovn-northd/0.log" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.908714 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.948973 4772 scope.go:117] "RemoveContainer" containerID="3e806373a2604b5465de7a3913d6865c82f0689bac61f26c430950d7d4efb948" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.976495 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4c0e-account-create-update-wlbfm"] Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.976622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987007 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987035 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987044 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrz6x\" (UniqueName: \"kubernetes.io/projected/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-kube-api-access-rrz6x\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987053 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987061 4772 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987087 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72jk7\" (UniqueName: \"kubernetes.io/projected/aea5ee43-36e3-437d-8aca-b2faedd87c5b-kube-api-access-72jk7\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987097 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987105 4772 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987113 4772 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aea5ee43-36e3-437d-8aca-b2faedd87c5b-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.987664 4772 scope.go:117] "RemoveContainer" containerID="db38347574e8ea3471da74617b5c2b8fd8e23430f530dbd434f5aba2a153f9bb" Jan 27 15:31:34 crc kubenswrapper[4772]: I0127 15:31:34.995685 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4c0e-account-create-update-wlbfm"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.002386 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data" (OuterVolumeSpecName: "config-data") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.006229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.008324 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.013868 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.019010 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.024818 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.027004 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.028779 4772 scope.go:117] "RemoveContainer" containerID="7343cd6a2a5cf705b558b4cc862d749d392235682218489d0106143cb8a5d4bc" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.032099 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data" (OuterVolumeSpecName: "config-data") pod "683f458e-44e9-49ea-a66b-4ac91a3f2bc1" (UID: "683f458e-44e9-49ea-a66b-4ac91a3f2bc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.044662 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.047144 4772 scope.go:117] "RemoveContainer" containerID="faf687181014b14838de86572705cbe5952bdabca1b3fad7e35afc3ce6238c0f" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.066377 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aea5ee43-36e3-437d-8aca-b2faedd87c5b" (UID: "aea5ee43-36e3-437d-8aca-b2faedd87c5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.071325 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.077893 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.077959 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.087735 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.087781 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-429qb\" (UniqueName: \"kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.087843 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.087880 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.087911 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088012 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088037 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts\") pod \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\" (UID: \"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088371 4772 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 
15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088383 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088394 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/683f458e-44e9-49ea-a66b-4ac91a3f2bc1-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088403 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088413 4772 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aea5ee43-36e3-437d-8aca-b2faedd87c5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088805 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config" (OuterVolumeSpecName: "config") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.088990 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.089020 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts" (OuterVolumeSpecName: "scripts") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.090561 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb" (OuterVolumeSpecName: "kube-api-access-429qb") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "kube-api-access-429qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.092787 4772 scope.go:117] "RemoveContainer" containerID="0447c2ea1d147e4cee27fce146e4edc38d746774dc492452f5da3c48df7973bb" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.115386 4772 scope.go:117] "RemoveContainer" containerID="7b0085db2ce3021657d7773e88196b66b6759beeca3bff2b51fc3fdf5d6b4bd2" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.115859 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.122115 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.124421 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.128261 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.128385 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="galera" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.141784 4772 scope.go:117] "RemoveContainer" containerID="81bb10c06283521cef14702be02bc4e89a7f82e4ae6c7d56b76d0d05f92797d0" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.166299 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.168541 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" (UID: "8b8101bc-2ddf-48ed-9b92-e8f9e5e71938"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.169474 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.184448 4772 scope.go:117] "RemoveContainer" containerID="a4293d3cbd138216987430f5dab62fa26e55c56743eee0b42dd4fc7797a52afd" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.188051 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193674 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193715 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193725 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-rundir\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193737 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05445a27-d839-4a60-8338-5ee5f2c3f9d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193746 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193754 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193762 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193769 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4llv\" (UniqueName: \"kubernetes.io/projected/05445a27-d839-4a60-8338-5ee5f2c3f9d7-kube-api-access-z4llv\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.193779 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-429qb\" (UniqueName: \"kubernetes.io/projected/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938-kube-api-access-429qb\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.198938 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.205176 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.245065 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.396472 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thp4z\" (UniqueName: \"kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z\") pod \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.396627 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts\") pod \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\" (UID: \"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.397422 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" (UID: "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.401213 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z" (OuterVolumeSpecName: "kube-api-access-thp4z") pod "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" (UID: "4cbf7469-816d-4e54-a7ad-b5b76d0d59d6"). InnerVolumeSpecName "kube-api-access-thp4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.498577 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thp4z\" (UniqueName: \"kubernetes.io/projected/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-kube-api-access-thp4z\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.498619 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.605452 4772 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Jan 27 15:31:35 crc kubenswrapper[4772]: E0127 15:31:35.606900 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data podName:508c3d5b-212a-46da-9a55-de3f35d7019b nodeName:}" failed. No retries permitted until 2026-01-27 15:31:43.60686792 +0000 UTC m=+1489.587477018 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data") pod "rabbitmq-server-0" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b") : configmap "rabbitmq-config-data" not found Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.788213 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.895564 4772 generic.go:334] "Generic (PLEG): container finished" podID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerID="d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a" exitCode=0 Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.895614 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerDied","Data":"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a"} Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.895646 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.895688 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"76fdbdb1-d48a-4cd1-8372-78887671dce8","Type":"ContainerDied","Data":"09e6c8b66552c99b1f924df5f88d4156d8a5bb2bf8b6bbb8e0fc50cdfa96e1ad"} Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.895709 4772 scope.go:117] "RemoveContainer" containerID="d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.899123 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8b8101bc-2ddf-48ed-9b92-e8f9e5e71938/ovn-northd/0.log" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.899211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8b8101bc-2ddf-48ed-9b92-e8f9e5e71938","Type":"ContainerDied","Data":"383ce19f4879446a46975b1e3757ca75d5dbab13e103b56af11750ee3019f6bc"} Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.899297 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.909526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qmppl" event={"ID":"4cbf7469-816d-4e54-a7ad-b5b76d0d59d6","Type":"ContainerDied","Data":"d5deb0f3cfb55cc15d206b2ff6d6a2e4b5ccc8b9efe8608e2073fe3df0f8d559"} Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.909611 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qmppl" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.915798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.915914 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.915953 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc 
kubenswrapper[4772]: I0127 15:31:35.916044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916085 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916107 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916126 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916146 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gbh9\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916200 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf\") 
pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916217 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.916902 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.918387 4772 generic.go:334] "Generic (PLEG): container finished" podID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerID="f002759dea4443f7600e0f76f24481c1604449a5ee31bd8aa53171a2121ec4b2" exitCode=0 Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.918466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerDied","Data":"f002759dea4443f7600e0f76f24481c1604449a5ee31bd8aa53171a2121ec4b2"} Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.918538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.918911 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.922268 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.922367 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info" (OuterVolumeSpecName: "pod-info") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.923033 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.925251 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.926304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9" (OuterVolumeSpecName: "kube-api-access-9gbh9") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "kube-api-access-9gbh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.939452 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data" (OuterVolumeSpecName: "config-data") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.961832 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf" (OuterVolumeSpecName: "server-conf") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.978134 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 15:31:35 crc kubenswrapper[4772]: I0127 15:31:35.986678 4772 scope.go:117] "RemoveContainer" containerID="d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017611 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017683 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017727 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017816 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8h8d\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017883 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017912 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017936 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.017974 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.018056 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") pod \"76fdbdb1-d48a-4cd1-8372-78887671dce8\" (UID: \"76fdbdb1-d48a-4cd1-8372-78887671dce8\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.018076 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls\") pod \"508c3d5b-212a-46da-9a55-de3f35d7019b\" (UID: \"508c3d5b-212a-46da-9a55-de3f35d7019b\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019791 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019825 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76fdbdb1-d48a-4cd1-8372-78887671dce8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019837 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019850 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019877 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019891 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gbh9\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-kube-api-access-9gbh9\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019902 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019913 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019924 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76fdbdb1-d48a-4cd1-8372-78887671dce8-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.019936 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76fdbdb1-d48a-4cd1-8372-78887671dce8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.023402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod 
"508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.023885 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.024867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: W0127 15:31:36.024943 4772 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/76fdbdb1-d48a-4cd1-8372-78887671dce8/volumes/kubernetes.io~projected/rabbitmq-confd Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.025000 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "76fdbdb1-d48a-4cd1-8372-78887671dce8" (UID: "76fdbdb1-d48a-4cd1-8372-78887671dce8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.028391 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d" (OuterVolumeSpecName: "kube-api-access-l8h8d") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "kube-api-access-l8h8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.028683 4772 scope.go:117] "RemoveContainer" containerID="d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.028475 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: E0127 15:31:36.029924 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a\": container with ID starting with d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a not found: ID does not exist" containerID="d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.029987 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a"} err="failed to get container status \"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a\": rpc error: code = NotFound desc = could not find container \"d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a\": container with ID starting with d8699d4e2fb6bcbb97c43048a20aeda8d17be226258e1acddb0364ab41c23e4a not found: ID does not exist" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.030014 4772 scope.go:117] "RemoveContainer" containerID="d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594" Jan 27 15:31:36 crc kubenswrapper[4772]: E0127 15:31:36.030622 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594\": container with ID starting with d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594 not found: ID does not exist" containerID="d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.030652 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594"} 
err="failed to get container status \"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594\": rpc error: code = NotFound desc = could not find container \"d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594\": container with ID starting with d53d0dfba4b0af64ac6186cc8eb8efb21a1ec89a66c075c5a53ab1db9987e594 not found: ID does not exist" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.030680 4772 scope.go:117] "RemoveContainer" containerID="b1542ba131aec1cffd5520f2969b843d3aa12fe7b4cd60022addce3e73977b99" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.031857 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info" (OuterVolumeSpecName: "pod-info") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.033343 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.033402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.040063 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.044327 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.049364 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qmppl"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.049978 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data" (OuterVolumeSpecName: "config-data") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.054409 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.060007 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.070039 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf" (OuterVolumeSpecName: "server-conf") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.092062 4772 scope.go:117] "RemoveContainer" containerID="f351431c9793a13f48f307e65178046dd4ccdc52ebd7ba269a580599ff0da01a" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.115587 4772 scope.go:117] "RemoveContainer" containerID="9005da10eaad68221a5ab75b0d10da02a46a7bd38d46bece0339dd56d8e2fc51" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.116438 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "508c3d5b-212a-46da-9a55-de3f35d7019b" (UID: "508c3d5b-212a-46da-9a55-de3f35d7019b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121709 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/508c3d5b-212a-46da-9a55-de3f35d7019b-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121767 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121781 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121797 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/508c3d5b-212a-46da-9a55-de3f35d7019b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121813 4772 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76fdbdb1-d48a-4cd1-8372-78887671dce8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121825 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121901 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121962 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.121990 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.123883 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/508c3d5b-212a-46da-9a55-de3f35d7019b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.123896 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.123907 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/508c3d5b-212a-46da-9a55-de3f35d7019b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.123919 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8h8d\" (UniqueName: \"kubernetes.io/projected/508c3d5b-212a-46da-9a55-de3f35d7019b-kube-api-access-l8h8d\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.139868 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.225394 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.244888 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.261742 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.421650 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.531718 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.531789 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksdpm\" (UniqueName: \"kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.531839 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.531865 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.531886 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.532680 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.532777 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.532847 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts\") pod \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\" (UID: \"6e790127-8223-4b0c-8a5d-21e1bb15fa30\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.538312 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts" (OuterVolumeSpecName: "scripts") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.538541 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm" (OuterVolumeSpecName: "kube-api-access-ksdpm") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "kube-api-access-ksdpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.554453 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.558406 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.560845 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data" (OuterVolumeSpecName: "config-data") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.563532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.580900 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.583691 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e790127-8223-4b0c-8a5d-21e1bb15fa30" (UID: "6e790127-8223-4b0c-8a5d-21e1bb15fa30"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634805 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634855 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634868 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634877 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-config-data\") on node \"crc\" DevicePath \"\"" Jan 
27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634887 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksdpm\" (UniqueName: \"kubernetes.io/projected/6e790127-8223-4b0c-8a5d-21e1bb15fa30-kube-api-access-ksdpm\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634896 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634904 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.634912 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e790127-8223-4b0c-8a5d-21e1bb15fa30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.685269 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05445a27-d839-4a60-8338-5ee5f2c3f9d7" path="/var/lib/kubelet/pods/05445a27-d839-4a60-8338-5ee5f2c3f9d7/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.685654 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" path="/var/lib/kubelet/pods/4cbf7469-816d-4e54-a7ad-b5b76d0d59d6/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.686354 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" path="/var/lib/kubelet/pods/683f458e-44e9-49ea-a66b-4ac91a3f2bc1/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.687512 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" path="/var/lib/kubelet/pods/76fdbdb1-d48a-4cd1-8372-78887671dce8/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.688891 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" path="/var/lib/kubelet/pods/8b8101bc-2ddf-48ed-9b92-e8f9e5e71938/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.690370 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" path="/var/lib/kubelet/pods/a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.691668 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" path="/var/lib/kubelet/pods/aea5ee43-36e3-437d-8aca-b2faedd87c5b/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.692875 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" path="/var/lib/kubelet/pods/f63bf600-ff03-43a3-92b4-fe8ac68a9bb7/volumes" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.829847 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.934143 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" containerID="468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc" exitCode=0 Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.934214 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-677fb7d6fc-djjsx" event={"ID":"6e790127-8223-4b0c-8a5d-21e1bb15fa30","Type":"ContainerDied","Data":"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.934518 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-677fb7d6fc-djjsx" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.934593 4772 scope.go:117] "RemoveContainer" containerID="468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.935201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-677fb7d6fc-djjsx" event={"ID":"6e790127-8223-4b0c-8a5d-21e1bb15fa30","Type":"ContainerDied","Data":"8b30129bf5b3504ae600edeaafe66f62d6f0c11b788461d423310f03199da3c5"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.944773 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.944853 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxbj\" (UniqueName: \"kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.944965 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.944990 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc 
kubenswrapper[4772]: I0127 15:31:36.945139 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.945212 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.945263 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.945300 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config\") pod \"b1515626-5d79-408d-abc1-cb92abd58f3f\" (UID: \"b1515626-5d79-408d-abc1-cb92abd58f3f\") " Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.946299 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.946394 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.947079 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.947198 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.947644 4772 generic.go:334] "Generic (PLEG): container finished" podID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" exitCode=0 Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.947702 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dbbd3c83-3fde-4b11-8ef0-add837d393ce","Type":"ContainerDied","Data":"788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.949179 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj" (OuterVolumeSpecName: "kube-api-access-4rxbj") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "kube-api-access-4rxbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.951632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"508c3d5b-212a-46da-9a55-de3f35d7019b","Type":"ContainerDied","Data":"044e360ab5ed48dba1c044f12dafd0e510d6847bb09f3238ce3b8c8d2130f226"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.951752 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.955193 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.957651 4772 generic.go:334] "Generic (PLEG): container finished" podID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" exitCode=0 Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.957691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerDied","Data":"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.957711 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b1515626-5d79-408d-abc1-cb92abd58f3f","Type":"ContainerDied","Data":"e699d423eedfd6502021873114f8ac6157951b5b24e3387e2b6a5c652a5f6465"} Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.957971 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.973037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:36 crc kubenswrapper[4772]: I0127 15:31:36.989380 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "b1515626-5d79-408d-abc1-cb92abd58f3f" (UID: "b1515626-5d79-408d-abc1-cb92abd58f3f"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.003305 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.020806 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.028253 4772 scope.go:117] "RemoveContainer" containerID="468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.028989 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc\": container with ID starting with 468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc not found: ID does not exist" containerID="468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.029038 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc"} err="failed to get container status \"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc\": rpc error: code = NotFound desc = could not find container \"468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc\": container with ID starting with 468321c234874e808e21c356adbece5162a3e84011f0215b573e541258fb76bc not found: ID does not exist" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.029071 4772 scope.go:117] "RemoveContainer" containerID="f002759dea4443f7600e0f76f24481c1604449a5ee31bd8aa53171a2121ec4b2" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.032349 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 15:31:37 crc kubenswrapper[4772]: 
I0127 15:31:37.046025 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2cp\" (UniqueName: \"kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp\") pod \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046130 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data\") pod \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046215 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle\") pod \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\" (UID: \"dbbd3c83-3fde-4b11-8ef0-add837d393ce\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046691 4772 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046720 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxbj\" (UniqueName: \"kubernetes.io/projected/b1515626-5d79-408d-abc1-cb92abd58f3f-kube-api-access-4rxbj\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046740 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046763 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046777 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046790 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1515626-5d79-408d-abc1-cb92abd58f3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046801 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.046811 4772 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b1515626-5d79-408d-abc1-cb92abd58f3f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.049304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp" (OuterVolumeSpecName: "kube-api-access-vh2cp") pod "dbbd3c83-3fde-4b11-8ef0-add837d393ce" (UID: "dbbd3c83-3fde-4b11-8ef0-add837d393ce"). InnerVolumeSpecName "kube-api-access-vh2cp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.065816 4772 scope.go:117] "RemoveContainer" containerID="900401625caff4c2d87fe06884c7dcba7f46fdc58e9213b1a6cc2cf36d383e52" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.068972 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.079483 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbbd3c83-3fde-4b11-8ef0-add837d393ce" (UID: "dbbd3c83-3fde-4b11-8ef0-add837d393ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.079897 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-677fb7d6fc-djjsx"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.080817 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.090927 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data" (OuterVolumeSpecName: "config-data") pod "dbbd3c83-3fde-4b11-8ef0-add837d393ce" (UID: "dbbd3c83-3fde-4b11-8ef0-add837d393ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.096532 4772 scope.go:117] "RemoveContainer" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.109780 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.110042 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.110249 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.110270 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.118922 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.120245 4772 scope.go:117] "RemoveContainer" containerID="3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.125428 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.126644 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.126677 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.148321 4772 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-vh2cp\" (UniqueName: \"kubernetes.io/projected/dbbd3c83-3fde-4b11-8ef0-add837d393ce-kube-api-access-vh2cp\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.148663 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.148678 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.148692 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbbd3c83-3fde-4b11-8ef0-add837d393ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.148925 4772 scope.go:117] "RemoveContainer" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.149605 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0\": container with ID starting with 21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0 not found: ID does not exist" containerID="21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.149653 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0"} err="failed to get container status \"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0\": rpc error: code = NotFound desc = could not find container 
\"21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0\": container with ID starting with 21613f2614f5809a9a792371fe2e685753a1a2fc6ea2f8fa7dcc2390d4bafda0 not found: ID does not exist" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.149680 4772 scope.go:117] "RemoveContainer" containerID="3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39" Jan 27 15:31:37 crc kubenswrapper[4772]: E0127 15:31:37.149955 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39\": container with ID starting with 3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39 not found: ID does not exist" containerID="3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.150732 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39"} err="failed to get container status \"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39\": rpc error: code = NotFound desc = could not find container \"3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39\": container with ID starting with 3f38ceb1ab131833479b9e418df05230599249523d60e8e11929add232b03e39 not found: ID does not exist" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.290053 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.296116 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.320596 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.350124 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle\") pod \"b83f7578-8113-46c8-be24-5968aa0ca563\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.350269 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq4lp\" (UniqueName: \"kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp\") pod \"b83f7578-8113-46c8-be24-5968aa0ca563\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.350692 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data\") pod \"b83f7578-8113-46c8-be24-5968aa0ca563\" (UID: \"b83f7578-8113-46c8-be24-5968aa0ca563\") " Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.353297 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp" (OuterVolumeSpecName: "kube-api-access-hq4lp") pod "b83f7578-8113-46c8-be24-5968aa0ca563" (UID: "b83f7578-8113-46c8-be24-5968aa0ca563"). InnerVolumeSpecName "kube-api-access-hq4lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.367383 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data" (OuterVolumeSpecName: "config-data") pod "b83f7578-8113-46c8-be24-5968aa0ca563" (UID: "b83f7578-8113-46c8-be24-5968aa0ca563"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.379767 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b83f7578-8113-46c8-be24-5968aa0ca563" (UID: "b83f7578-8113-46c8-be24-5968aa0ca563"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.452460 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.452490 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83f7578-8113-46c8-be24-5968aa0ca563-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.452500 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq4lp\" (UniqueName: \"kubernetes.io/projected/b83f7578-8113-46c8-be24-5968aa0ca563-kube-api-access-hq4lp\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.971438 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dbbd3c83-3fde-4b11-8ef0-add837d393ce","Type":"ContainerDied","Data":"4ac8efb7b8696b151d1bdc121a58850ad086edea13390d08276f3048f0eea493"} Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.971493 4772 scope.go:117] "RemoveContainer" containerID="788384a3ae6b89b3eeabbb3fe7578f4cb514172f7c7e0c341ec2b75ed4d75a29" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.971507 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.973492 4772 generic.go:334] "Generic (PLEG): container finished" podID="b83f7578-8113-46c8-be24-5968aa0ca563" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" exitCode=0 Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.973542 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.973553 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b83f7578-8113-46c8-be24-5968aa0ca563","Type":"ContainerDied","Data":"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e"} Jan 27 15:31:37 crc kubenswrapper[4772]: I0127 15:31:37.973578 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b83f7578-8113-46c8-be24-5968aa0ca563","Type":"ContainerDied","Data":"ee6d65efb4f3df3d96335b7b6b58d4ee20a12c71c0ca644b8c8c4208300d2710"} Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.008515 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.019381 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.025472 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.030009 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.046097 4772 scope.go:117] "RemoveContainer" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.073915 4772 scope.go:117] "RemoveContainer" 
containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" Jan 27 15:31:38 crc kubenswrapper[4772]: E0127 15:31:38.074473 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e\": container with ID starting with e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e not found: ID does not exist" containerID="e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.074509 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e"} err="failed to get container status \"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e\": rpc error: code = NotFound desc = could not find container \"e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e\": container with ID starting with e92037085b98ccc46bbd64416f98018d2426a17d0883dd17b830d5574a8a0f4e not found: ID does not exist" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.676119 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" path="/var/lib/kubelet/pods/508c3d5b-212a-46da-9a55-de3f35d7019b/volumes" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.678279 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" path="/var/lib/kubelet/pods/6e790127-8223-4b0c-8a5d-21e1bb15fa30/volumes" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.679870 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" path="/var/lib/kubelet/pods/b1515626-5d79-408d-abc1-cb92abd58f3f/volumes" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.682656 4772 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" path="/var/lib/kubelet/pods/b83f7578-8113-46c8-be24-5968aa0ca563/volumes" Jan 27 15:31:38 crc kubenswrapper[4772]: I0127 15:31:38.684010 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" path="/var/lib/kubelet/pods/dbbd3c83-3fde-4b11-8ef0-add837d393ce/volumes" Jan 27 15:31:42 crc kubenswrapper[4772]: I0127 15:31:42.058674 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:31:42 crc kubenswrapper[4772]: I0127 15:31:42.059290 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.109314 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.109814 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" 
containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.110193 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.110239 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.110917 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.112578 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.113940 4772 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:42 crc kubenswrapper[4772]: E0127 15:31:42.113994 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.022364 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.065471 4772 generic.go:334] "Generic (PLEG): container finished" podID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerID="72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37" exitCode=0 Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.065510 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647c88bb6f-wzf82" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.065516 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerDied","Data":"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37"} Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.065545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647c88bb6f-wzf82" event={"ID":"6cf131c4-a5bd-452b-8598-42312c3a0270","Type":"ContainerDied","Data":"eb3fc136e47d75ea92171b2a25f1728b294a61ff0f248fa056a324eadfc98f00"} Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.065567 4772 scope.go:117] "RemoveContainer" containerID="b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.089374 4772 scope.go:117] "RemoveContainer" containerID="72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.106633 4772 scope.go:117] "RemoveContainer" containerID="b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2" Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.107029 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2\": container with ID starting with b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2 not found: ID does not exist" containerID="b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.107069 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2"} err="failed to get container status 
\"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2\": rpc error: code = NotFound desc = could not find container \"b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2\": container with ID starting with b96f34157cbed4eef2143feeb0fd51ea8ff8193f4fa6d28ad6a9487061aba8b2 not found: ID does not exist" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.107092 4772 scope.go:117] "RemoveContainer" containerID="72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37" Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.107543 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37\": container with ID starting with 72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37 not found: ID does not exist" containerID="72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.107583 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37"} err="failed to get container status \"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37\": rpc error: code = NotFound desc = could not find container \"72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37\": container with ID starting with 72824ad39b806a2254b462f1a46f766a404dd5dd1e5172059745c7930bd54b37 not found: ID does not exist" Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.109030 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.109423 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.109672 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.109710 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.110212 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.111778 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.113161 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:47 crc kubenswrapper[4772]: E0127 15:31:47.113293 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153375 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153467 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153511 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153553 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brxbx\" (UniqueName: \"kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153618 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.153736 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs\") pod \"6cf131c4-a5bd-452b-8598-42312c3a0270\" (UID: \"6cf131c4-a5bd-452b-8598-42312c3a0270\") " Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.158993 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx" (OuterVolumeSpecName: "kube-api-access-brxbx") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "kube-api-access-brxbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.160878 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.193245 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.195897 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.202036 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config" (OuterVolumeSpecName: "config") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.208127 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.228932 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6cf131c4-a5bd-452b-8598-42312c3a0270" (UID: "6cf131c4-a5bd-452b-8598-42312c3a0270"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255314 4772 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255353 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brxbx\" (UniqueName: \"kubernetes.io/projected/6cf131c4-a5bd-452b-8598-42312c3a0270-kube-api-access-brxbx\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255389 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255401 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-config\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255412 4772 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255422 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.255431 4772 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6cf131c4-a5bd-452b-8598-42312c3a0270-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.395150 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:31:47 crc kubenswrapper[4772]: I0127 15:31:47.399748 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-647c88bb6f-wzf82"] Jan 27 15:31:48 crc kubenswrapper[4772]: I0127 15:31:48.675960 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" path="/var/lib/kubelet/pods/6cf131c4-a5bd-452b-8598-42312c3a0270/volumes" Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.111408 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.111819 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking 
if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.332030 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.333369 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.333409 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.334331 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.339690 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:52 crc kubenswrapper[4772]: E0127 15:31:52.339745 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.109793 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.111573 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.111771 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.112102 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.112141 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.113260 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.115641 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Jan 27 15:31:57 crc kubenswrapper[4772]: E0127 15:31:57.115703 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-cqx7r" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.384255 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cqx7r_38ebd422-35c5-4682-8a4d-ca9073728d7c/ovs-vswitchd/0.log" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.385114 4772 generic.go:334] "Generic (PLEG): container finished" podID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" exitCode=137 Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.385181 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerDied","Data":"d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91"} Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.862059 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cqx7r_38ebd422-35c5-4682-8a4d-ca9073728d7c/ovs-vswitchd/0.log" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.863340 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.906535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvchv\" (UniqueName: \"kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.907197 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.907398 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.907534 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.907643 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908258 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log" (OuterVolumeSpecName: "var-log") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908376 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib" (OuterVolumeSpecName: "var-lib") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908453 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run\") pod \"38ebd422-35c5-4682-8a4d-ca9073728d7c\" (UID: \"38ebd422-35c5-4682-8a4d-ca9073728d7c\") " Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908877 4772 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-lib\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908896 4772 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-etc-ovs\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908906 4772 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.908931 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run" (OuterVolumeSpecName: "var-run") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.909724 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts" (OuterVolumeSpecName: "scripts") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:31:57 crc kubenswrapper[4772]: I0127 15:31:57.912935 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv" (OuterVolumeSpecName: "kube-api-access-zvchv") pod "38ebd422-35c5-4682-8a4d-ca9073728d7c" (UID: "38ebd422-35c5-4682-8a4d-ca9073728d7c"). InnerVolumeSpecName "kube-api-access-zvchv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.010347 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/38ebd422-35c5-4682-8a4d-ca9073728d7c-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.010386 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvchv\" (UniqueName: \"kubernetes.io/projected/38ebd422-35c5-4682-8a4d-ca9073728d7c-kube-api-access-zvchv\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.010395 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38ebd422-35c5-4682-8a4d-ca9073728d7c-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.396314 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cqx7r_38ebd422-35c5-4682-8a4d-ca9073728d7c/ovs-vswitchd/0.log" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.397151 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cqx7r" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.397147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cqx7r" event={"ID":"38ebd422-35c5-4682-8a4d-ca9073728d7c","Type":"ContainerDied","Data":"b4ae3e61c086f91c9c3a7442484ecc85a4bdf545d39601e45239a3351393b9ff"} Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.397241 4772 scope.go:117] "RemoveContainer" containerID="d6579efc0c6a14eb40a1349e6b5e9e288881435286dc3fff811374b436b48c91" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.410546 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerID="0b50101071feccad5793667a8f4849d22482c6d522fac228c249d69d6d557cdf" exitCode=137 Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.410594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"0b50101071feccad5793667a8f4849d22482c6d522fac228c249d69d6d557cdf"} Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.437448 4772 scope.go:117] "RemoveContainer" containerID="4b64614c7f3007f9118f8ed226ede92035da74f66d831e70ce26d6d3d8e9f47b" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.437975 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"] Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.442679 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-cqx7r"] Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.466359 4772 scope.go:117] "RemoveContainer" containerID="c2ec2d9ef51a12150ebe6df637e29030ff2b622c19a7ada45c6cd396c44b8636" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.585222 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.673575 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" path="/var/lib/kubelet/pods/38ebd422-35c5-4682-8a4d-ca9073728d7c/volumes" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719483 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719591 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719644 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxlv4\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719703 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" 
(UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.719786 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\" (UID: \"3ef68955-b80c-4732-9e87-0bec53d0b3a0\") " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.720125 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock" (OuterVolumeSpecName: "lock") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.720221 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache" (OuterVolumeSpecName: "cache") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.723007 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.723405 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4" (OuterVolumeSpecName: "kube-api-access-mxlv4") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). 
InnerVolumeSpecName "kube-api-access-mxlv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.726325 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "swift") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.821119 4772 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.821156 4772 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.821199 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxlv4\" (UniqueName: \"kubernetes.io/projected/3ef68955-b80c-4732-9e87-0bec53d0b3a0-kube-api-access-mxlv4\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.821236 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.821248 4772 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3ef68955-b80c-4732-9e87-0bec53d0b3a0-lock\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.835175 4772 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.923343 4772 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:58 crc kubenswrapper[4772]: I0127 15:31:58.972160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ef68955-b80c-4732-9e87-0bec53d0b3a0" (UID: "3ef68955-b80c-4732-9e87-0bec53d0b3a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.024325 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef68955-b80c-4732-9e87-0bec53d0b3a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.427497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3ef68955-b80c-4732-9e87-0bec53d0b3a0","Type":"ContainerDied","Data":"7e2686f92b31392fd2420828f9959abe37458794a1d13beae3bf48377776f704"} Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.427559 4772 scope.go:117] "RemoveContainer" containerID="0b50101071feccad5793667a8f4849d22482c6d522fac228c249d69d6d557cdf" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.427596 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.472346 4772 scope.go:117] "RemoveContainer" containerID="8d889567d10b3e8868d76680ff442da2a14216919aae766c356918ec9960b9a4" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.473086 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.480115 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.531239 4772 scope.go:117] "RemoveContainer" containerID="c1cf3012e8501ba3a809e028a1ab49c960d95fb090a04b4dbca6cd01d2de9524" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.548883 4772 scope.go:117] "RemoveContainer" containerID="b0a7c137687a720a7d8c3f84cc586f4b9d3bde7c9bc9e2e0c83a325c2ae23322" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.575274 4772 scope.go:117] "RemoveContainer" containerID="8bbb31c1be222187b0e9b27f07c1ac0fe66d8ad583df4ff6b26fec62ab98cf87" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.597563 4772 scope.go:117] "RemoveContainer" containerID="71b4242b9081be055bfb8bd2db6959d32259cd0c3ee2b95ddde1c1d2154be74b" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.623096 4772 scope.go:117] "RemoveContainer" containerID="bc57f117c387fb10832190ea21f63cdb319308d9390292395fb515e28966d217" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.646979 4772 scope.go:117] "RemoveContainer" containerID="99c9f47c0720632dfecbfc5e9152885ab96d751677b561767c79f0a032ca5cf5" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.671976 4772 scope.go:117] "RemoveContainer" containerID="0c6f6ecf89a4947c23560538762ca73dfe5e13c4acb04e206d91772a3cfc9c49" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.693791 4772 scope.go:117] "RemoveContainer" containerID="94e4c588a745acb16ce919a52f7150cf54119c1c41e94c9e658206e6b58958ed" Jan 27 15:31:59 crc 
kubenswrapper[4772]: I0127 15:31:59.739295 4772 scope.go:117] "RemoveContainer" containerID="494d3ebaeddb756bf375d2bc394a4b4086ee3e25d9a76747552d41c1f40a9737" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.755397 4772 scope.go:117] "RemoveContainer" containerID="ac32767b3784713a66fbfe32a337398a7461aa8ffad58bbfea7ccf6e3c4ee19d" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.770425 4772 scope.go:117] "RemoveContainer" containerID="c3f602f5b8fe5f978c40989adc1d0130c6aaae0dce0fc13d5e34bbe819e8eccb" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.788185 4772 scope.go:117] "RemoveContainer" containerID="5f271cd2dcb6b658cde722402c5b2945c28f4d7486cab8c56e064081779416a1" Jan 27 15:31:59 crc kubenswrapper[4772]: I0127 15:31:59.807224 4772 scope.go:117] "RemoveContainer" containerID="d35aa807e61d39133b8319305719556fcfa6889495c80253864eaf2dc48a450b" Jan 27 15:32:00 crc kubenswrapper[4772]: I0127 15:32:00.673437 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" path="/var/lib/kubelet/pods/3ef68955-b80c-4732-9e87-0bec53d0b3a0/volumes" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.553598 4772 scope.go:117] "RemoveContainer" containerID="e068687fbbe8ba2bc884327a323113a2f9b397134b3783cc71145217f0aced72" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.575988 4772 scope.go:117] "RemoveContainer" containerID="a76c09aaadcee4723d7ef767396afbe7396ff3e3af040a33171b3953859d1cba" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.658106 4772 scope.go:117] "RemoveContainer" containerID="6aa60721dd7c09b05e3a663482308f5a6da188370cc19651da9a73a40e00696f" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.706010 4772 scope.go:117] "RemoveContainer" containerID="a5599751ce46331dd2a224ba692cd6619979f4eb0205e3a54352eb587e777c31" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.732710 4772 scope.go:117] "RemoveContainer" 
containerID="666a2855e8df449d0b2a9f22d64efe41fc16e80a56e57924cea7c6f56eb00af0" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.762941 4772 scope.go:117] "RemoveContainer" containerID="317ff691da5e191e31778e1d02f29484703e057687e372739fcbc9dd6f8088d2" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.809110 4772 scope.go:117] "RemoveContainer" containerID="d1b5117c10f9331477f591f10a624b08ae6968087cc1bb15580ee055f80a719c" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.831424 4772 scope.go:117] "RemoveContainer" containerID="7ce8beebc480cca9e2ff0700b901cda6f6e2d53f77d8edbfd7e337a2359ae80a" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.858418 4772 scope.go:117] "RemoveContainer" containerID="fcb62876ceaa2921dde5172c985a61fd201c04281f9d06dbb383e8128d91c935" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.876386 4772 scope.go:117] "RemoveContainer" containerID="37cb21cfa353006443b3a1e31571db32c636cbf5e0c7a880cb766a2b91769826" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.913310 4772 scope.go:117] "RemoveContainer" containerID="e93f9f446173d4fd985d40db28827a7f313c9dbe0522a2d3003fa93c8ac7de5e" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.962939 4772 scope.go:117] "RemoveContainer" containerID="d22be9ecfb9cc0389dd0f2e64dbdb2f980e40813484563a0a652ad657fd8f5b7" Jan 27 15:32:02 crc kubenswrapper[4772]: I0127 15:32:02.984044 4772 scope.go:117] "RemoveContainer" containerID="6d6c94667c0ae61eab0c4931fc95c11f862c674ae06fd177d824e395ced6b9a6" Jan 27 15:32:03 crc kubenswrapper[4772]: I0127 15:32:03.000592 4772 scope.go:117] "RemoveContainer" containerID="d569280ad66a5087c9e0aa7b8abe04a7d97361bee2ca7b7c30646e77734ba51d" Jan 27 15:32:03 crc kubenswrapper[4772]: I0127 15:32:03.024428 4772 scope.go:117] "RemoveContainer" containerID="31d9e486da5aa706768e022e398d969ef41f15c9db5b579c83d50ae160db05a7" Jan 27 15:32:12 crc kubenswrapper[4772]: I0127 15:32:12.058898 4772 patch_prober.go:28] interesting 
pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:32:12 crc kubenswrapper[4772]: I0127 15:32:12.059294 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.058036 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.058663 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.058708 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.059306 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.059377 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" gracePeriod=600 Jan 27 15:32:42 crc kubenswrapper[4772]: E0127 15:32:42.378529 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.778799 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" exitCode=0 Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.778855 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2"} Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.778972 4772 scope.go:117] "RemoveContainer" containerID="1d1c45659af37dbb5fcad6152d119ca4f804c58006a54555795ff000f3b7aea9" Jan 27 15:32:42 crc kubenswrapper[4772]: I0127 15:32:42.779486 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:32:42 crc kubenswrapper[4772]: E0127 15:32:42.781088 4772 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:32:54 crc kubenswrapper[4772]: I0127 15:32:54.667753 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:32:54 crc kubenswrapper[4772]: E0127 15:32:54.668758 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.738727 4772 scope.go:117] "RemoveContainer" containerID="c454404cb2dabeb6539bab075b0096e5a7ba9d3726f1b7a2ce5d55b30cc778e8" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.762959 4772 scope.go:117] "RemoveContainer" containerID="c63a10e019701dbe41c4487398c76cb4acdd6a0eda99f6edb9df7d6273b71a27" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.813607 4772 scope.go:117] "RemoveContainer" containerID="e1a2cafeb608c7919a88b50bf39a141cb90ef87745db78d4f8f6a94522bb8d2e" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.838875 4772 scope.go:117] "RemoveContainer" containerID="a5ffbaeea04257a22f38554ccc4304785fadfe22ac90bb6e3544b162aab10857" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.854811 4772 scope.go:117] "RemoveContainer" containerID="d265eb93689c326c68ce844d36ec8e13845ff3f6cfb1ed7e88273d0cf4e91cbd" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 
15:33:03.890120 4772 scope.go:117] "RemoveContainer" containerID="597636ff183f237bb3b639ea5c67c6b5f6f29f40e362b71df3d4ec02eaa6036b" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.912247 4772 scope.go:117] "RemoveContainer" containerID="478f9eb73f50cba542d4259825587e98caddfe9513876ed4823af8b00681f571" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.932956 4772 scope.go:117] "RemoveContainer" containerID="7cb0416a54334bdd5699afd4b64397c193035c399e5586172a360ff52cd674f9" Jan 27 15:33:03 crc kubenswrapper[4772]: I0127 15:33:03.949915 4772 scope.go:117] "RemoveContainer" containerID="d2de8b3a1c27ebd01b5c3393c6dcb85d202fe549eef0c41d0f9f318c3b15d219" Jan 27 15:33:06 crc kubenswrapper[4772]: I0127 15:33:06.663746 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:33:06 crc kubenswrapper[4772]: E0127 15:33:06.664031 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:33:20 crc kubenswrapper[4772]: I0127 15:33:20.663949 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:33:20 crc kubenswrapper[4772]: E0127 15:33:20.664624 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:33:31 crc kubenswrapper[4772]: I0127 15:33:31.662999 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:33:31 crc kubenswrapper[4772]: E0127 15:33:31.663800 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:33:42 crc kubenswrapper[4772]: I0127 15:33:42.663719 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:33:42 crc kubenswrapper[4772]: E0127 15:33:42.664982 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:33:55 crc kubenswrapper[4772]: I0127 15:33:55.662801 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:33:55 crc kubenswrapper[4772]: E0127 15:33:55.663662 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.119733 4772 scope.go:117] "RemoveContainer" containerID="6ba95c7bf22c812cf8d7d855d86c702f5f7f90db05ec7fc2281ddec549f7d67b" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.153733 4772 scope.go:117] "RemoveContainer" containerID="03f8da2d80772e659c36db9a1b10a6be24dc704eb86ce89c04a5a14351b7726d" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.178426 4772 scope.go:117] "RemoveContainer" containerID="f76b5eae8b9d1fd746edffe9a9f5a02ca0ad4ea09665e63c5dbeacff4753fa40" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.195231 4772 scope.go:117] "RemoveContainer" containerID="ed51d0aa4ae1c7166bbf0464f2b405f79a0faa50f99c4244c9717d1a1fd81db2" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.214759 4772 scope.go:117] "RemoveContainer" containerID="d2b29cba9bcd684a9fa3005c73cbd809102e0bb6c21ef6ed5d53662bb4cdcdaa" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.237904 4772 scope.go:117] "RemoveContainer" containerID="5de6bd74908b324e47419d9f37b784b689e01e1c833ca0e1c7d7483a1e19037c" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.277077 4772 scope.go:117] "RemoveContainer" containerID="f581dd644d182efa5f740dc0b5a2f4adfb865bef3f027972802161889179f1d4" Jan 27 15:34:04 crc kubenswrapper[4772]: I0127 15:34:04.310794 4772 scope.go:117] "RemoveContainer" containerID="b87da5e7b978350e6830e0f65fce50644eee1e1665a4ebcd45d4d0010f0f31d7" Jan 27 15:34:09 crc kubenswrapper[4772]: I0127 15:34:09.662759 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:34:09 crc kubenswrapper[4772]: E0127 15:34:09.663327 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593053 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593775 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="galera" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593786 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="galera" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593807 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="swift-recon-cron" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593813 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="swift-recon-cron" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593820 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593828 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593836 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-notification-agent" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593842 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-notification-agent" Jan 27 
15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593853 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593859 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593870 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593875 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593882 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593888 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593894 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="openstack-network-exporter" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593900 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="openstack-network-exporter" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593908 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="setup-container" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593914 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" 
containerName="setup-container" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593920 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593926 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-server" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593937 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593943 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-server" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593948 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-expirer" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593953 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-expirer" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593962 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593967 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593974 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="proxy-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593980 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" 
containerName="proxy-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.593988 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.593994 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594003 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594009 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594026 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594032 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594040 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" containerName="memcached" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594045 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" containerName="memcached" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594055 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="setup-container" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594061 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" 
containerName="setup-container" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594066 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594072 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594080 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594086 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594094 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="mysql-bootstrap" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594099 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="mysql-bootstrap" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594106 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594112 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594123 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594129 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" 
containerName="object-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594136 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594142 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594151 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594157 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594178 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594185 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594193 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" containerName="kube-state-metrics" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594198 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" containerName="kube-state-metrics" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594208 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server-init" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594214 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" 
containerName="ovsdb-server-init" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594222 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="probe" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594227 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="probe" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594239 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594244 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594252 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594258 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594267 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-reaper" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594273 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-reaper" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594281 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594287 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" 
containerName="object-server" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594294 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594300 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594309 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594315 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594321 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" containerName="keystone-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594326 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" containerName="keystone-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594334 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b20b9215-5398-4100-bac4-763daa5ed222" containerName="nova-cell0-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594339 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20b9215-5398-4100-bac4-763daa5ed222" containerName="nova-cell0-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594349 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="rsync" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594354 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" 
containerName="rsync" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594363 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594369 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594375 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594380 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594390 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="cinder-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594396 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="cinder-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594404 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="ovn-northd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594409 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="ovn-northd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594416 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-central-agent" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594422 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" 
containerName="ceilometer-central-agent" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594431 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594437 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594443 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="sg-core" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594448 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="sg-core" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594455 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594460 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594470 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594476 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594483 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594488 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" 
containerName="ovs-vswitchd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594495 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594502 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594513 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594518 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-api" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594529 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594534 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594543 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594549 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594557 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594562 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" 
containerName="container-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594571 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594577 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:34:14 crc kubenswrapper[4772]: E0127 15:34:14.594583 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594588 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594704 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594714 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="openstack-network-exporter" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594725 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594733 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594742 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1515626-5d79-408d-abc1-cb92abd58f3f" containerName="galera" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594750 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594756 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b8101bc-2ddf-48ed-9b92-e8f9e5e71938" containerName="ovn-northd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594763 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594774 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="swift-recon-cron" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594784 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="cinder-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594793 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ba01b3-fadf-4bc3-9bc3-7c647cfe7e66" containerName="memcached" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594801 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594807 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594816 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="rsync" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594825 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594830 4772 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="9a02b617-28a7-4262-a110-f1c71763ad19" containerName="glance-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594839 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c8f9a4-c6ef-42b8-8543-ff8b5347977e" containerName="nova-api-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594845 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e78641-77e6-4c89-b5c9-0d6f3c9a9343" containerName="glance-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594853 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594862 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594870 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbbd3c83-3fde-4b11-8ef0-add837d393ce" containerName="nova-cell1-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594880 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594889 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="proxy-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594898 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="683f458e-44e9-49ea-a66b-4ac91a3f2bc1" containerName="probe" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594908 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594917 4772 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594923 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="508c3d5b-212a-46da-9a55-de3f35d7019b" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594931 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-replicator" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594939 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-auditor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594945 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-central-agent" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594953 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594962 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594970 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovs-vswitchd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594977 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="object-expirer" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.594984 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63bf600-ff03-43a3-92b4-fe8ac68a9bb7" containerName="nova-metadata-metadata" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 
15:34:14.594992 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595000 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbf7469-816d-4e54-a7ad-b5b76d0d59d6" containerName="mariadb-account-create-update" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595007 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b20b9215-5398-4100-bac4-763daa5ed222" containerName="nova-cell0-conductor-conductor" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595015 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e790127-8223-4b0c-8a5d-21e1bb15fa30" containerName="keystone-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595024 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4205dfea-7dc7-496a-9745-fc5e3d0a418a" containerName="placement-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595031 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83f7578-8113-46c8-be24-5968aa0ca563" containerName="nova-scheduler-scheduler" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595040 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf131c4-a5bd-452b-8598-42312c3a0270" containerName="neutron-httpd" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595048 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="sg-core" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595055 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="container-updater" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595061 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" 
containerName="barbican-api" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595069 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="76fdbdb1-d48a-4cd1-8372-78887671dce8" containerName="rabbitmq" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595078 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f54218-5889-4ae9-a7a1-7ed4895ad63c" containerName="kube-state-metrics" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595085 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="766c2a26-46ea-41b2-ba0c-2101ec9477d5" containerName="barbican-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595092 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea5ee43-36e3-437d-8aca-b2faedd87c5b" containerName="ceilometer-notification-agent" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595100 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="be772158-a71c-448d-8972-014f0d3a9ab8" containerName="cinder-api-log" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595108 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ebd422-35c5-4682-8a4d-ca9073728d7c" containerName="ovsdb-server" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.595115 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef68955-b80c-4732-9e87-0bec53d0b3a0" containerName="account-reaper" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.596105 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.604870 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.639515 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.639605 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkw2\" (UniqueName: \"kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.639628 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.741475 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.741554 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fqkw2\" (UniqueName: \"kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.741576 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.742124 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.742183 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.762583 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkw2\" (UniqueName: \"kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2\") pod \"certified-operators-jm8p7\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:14 crc kubenswrapper[4772]: I0127 15:34:14.929147 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:15 crc kubenswrapper[4772]: I0127 15:34:15.420314 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:15 crc kubenswrapper[4772]: I0127 15:34:15.517320 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerStarted","Data":"8a2e691ba04205978b1b1471c0a34857459b9169f2b1389dbdaf100526709579"} Jan 27 15:34:16 crc kubenswrapper[4772]: I0127 15:34:16.531095 4772 generic.go:334] "Generic (PLEG): container finished" podID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerID="53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8" exitCode=0 Jan 27 15:34:16 crc kubenswrapper[4772]: I0127 15:34:16.531496 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerDied","Data":"53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8"} Jan 27 15:34:19 crc kubenswrapper[4772]: I0127 15:34:19.555778 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerStarted","Data":"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9"} Jan 27 15:34:20 crc kubenswrapper[4772]: I0127 15:34:20.570791 4772 generic.go:334] "Generic (PLEG): container finished" podID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerID="a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9" exitCode=0 Jan 27 15:34:20 crc kubenswrapper[4772]: I0127 15:34:20.570911 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" 
event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerDied","Data":"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9"} Jan 27 15:34:21 crc kubenswrapper[4772]: I0127 15:34:21.663060 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:34:21 crc kubenswrapper[4772]: E0127 15:34:21.664917 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:34:22 crc kubenswrapper[4772]: I0127 15:34:22.588625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerStarted","Data":"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad"} Jan 27 15:34:22 crc kubenswrapper[4772]: I0127 15:34:22.620562 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jm8p7" podStartSLOduration=3.441936655 podStartE2EDuration="8.620525328s" podCreationTimestamp="2026-01-27 15:34:14 +0000 UTC" firstStartedPulling="2026-01-27 15:34:16.533407582 +0000 UTC m=+1642.514016720" lastFinishedPulling="2026-01-27 15:34:21.711996295 +0000 UTC m=+1647.692605393" observedRunningTime="2026-01-27 15:34:22.614740101 +0000 UTC m=+1648.595349239" watchObservedRunningTime="2026-01-27 15:34:22.620525328 +0000 UTC m=+1648.601134466" Jan 27 15:34:24 crc kubenswrapper[4772]: I0127 15:34:24.929894 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:24 crc 
kubenswrapper[4772]: I0127 15:34:24.929958 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:24 crc kubenswrapper[4772]: I0127 15:34:24.976463 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:34 crc kubenswrapper[4772]: I0127 15:34:34.979241 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:35 crc kubenswrapper[4772]: I0127 15:34:35.040142 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:35 crc kubenswrapper[4772]: I0127 15:34:35.687462 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jm8p7" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="registry-server" containerID="cri-o://ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad" gracePeriod=2 Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.090198 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.115325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqkw2\" (UniqueName: \"kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2\") pod \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.115513 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities\") pod \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.115588 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content\") pod \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\" (UID: \"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078\") " Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.118515 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities" (OuterVolumeSpecName: "utilities") pod "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" (UID: "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.123537 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2" (OuterVolumeSpecName: "kube-api-access-fqkw2") pod "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" (UID: "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078"). InnerVolumeSpecName "kube-api-access-fqkw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.183998 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" (UID: "72fd7bb8-3c20-4a0a-b0eb-94f4e9059078"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.217299 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.217617 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.217688 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqkw2\" (UniqueName: \"kubernetes.io/projected/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078-kube-api-access-fqkw2\") on node \"crc\" DevicePath \"\"" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.663425 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:34:36 crc kubenswrapper[4772]: E0127 15:34:36.664222 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:34:36 
crc kubenswrapper[4772]: I0127 15:34:36.698905 4772 generic.go:334] "Generic (PLEG): container finished" podID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerID="ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad" exitCode=0 Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.698956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerDied","Data":"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad"} Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.698988 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jm8p7" event={"ID":"72fd7bb8-3c20-4a0a-b0eb-94f4e9059078","Type":"ContainerDied","Data":"8a2e691ba04205978b1b1471c0a34857459b9169f2b1389dbdaf100526709579"} Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.699008 4772 scope.go:117] "RemoveContainer" containerID="ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.699005 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jm8p7" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.742339 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.744592 4772 scope.go:117] "RemoveContainer" containerID="a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.748597 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jm8p7"] Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.774533 4772 scope.go:117] "RemoveContainer" containerID="53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.801493 4772 scope.go:117] "RemoveContainer" containerID="ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad" Jan 27 15:34:36 crc kubenswrapper[4772]: E0127 15:34:36.802296 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad\": container with ID starting with ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad not found: ID does not exist" containerID="ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.802365 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad"} err="failed to get container status \"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad\": rpc error: code = NotFound desc = could not find container \"ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad\": container with ID starting with ec057535eadf435a5c1fbd29551134ac66e1d9025e2bb32c9f6a0f8140dd95ad not 
found: ID does not exist" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.802408 4772 scope.go:117] "RemoveContainer" containerID="a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9" Jan 27 15:34:36 crc kubenswrapper[4772]: E0127 15:34:36.803213 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9\": container with ID starting with a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9 not found: ID does not exist" containerID="a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.803258 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9"} err="failed to get container status \"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9\": rpc error: code = NotFound desc = could not find container \"a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9\": container with ID starting with a0cad38dce3bede1d6985650b216e3e0c1590ff7d27f9064db990057561ca0d9 not found: ID does not exist" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.803286 4772 scope.go:117] "RemoveContainer" containerID="53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8" Jan 27 15:34:36 crc kubenswrapper[4772]: E0127 15:34:36.804630 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8\": container with ID starting with 53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8 not found: ID does not exist" containerID="53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8" Jan 27 15:34:36 crc kubenswrapper[4772]: I0127 15:34:36.804687 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8"} err="failed to get container status \"53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8\": rpc error: code = NotFound desc = could not find container \"53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8\": container with ID starting with 53008a4df13b2b0f0aaa84c7676410cc85c7c9299c329127da8a6c0db94363b8 not found: ID does not exist" Jan 27 15:34:38 crc kubenswrapper[4772]: I0127 15:34:38.671443 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" path="/var/lib/kubelet/pods/72fd7bb8-3c20-4a0a-b0eb-94f4e9059078/volumes" Jan 27 15:34:51 crc kubenswrapper[4772]: I0127 15:34:51.662799 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:34:51 crc kubenswrapper[4772]: E0127 15:34:51.663565 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.401908 4772 scope.go:117] "RemoveContainer" containerID="9627fca4ce2bbd20c54de88fa2250d98bc1976636644d325a8225826fd2e9ef2" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.436621 4772 scope.go:117] "RemoveContainer" containerID="f28ff63f10f8899bc8cd8fd5a42bd4249a187e430ea97934ef6b489554310751" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.472505 4772 scope.go:117] "RemoveContainer" containerID="4da9288c82c7401f434d2a53ff336e0d653eb3932d204eafc2869a5860cee4bc" Jan 27 15:35:04 
crc kubenswrapper[4772]: I0127 15:35:04.537250 4772 scope.go:117] "RemoveContainer" containerID="9853bf54eae9ce0f1c3b8ddee31101fe10bc44f0b0f41d495f936c0ac3cc7ec8" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.570812 4772 scope.go:117] "RemoveContainer" containerID="48911a4a107b6bf266b45bdb20df360ce0efcf35791daa4bc1413cb966d28fb0" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.601930 4772 scope.go:117] "RemoveContainer" containerID="46996df047d6fd10b3034c52a93ce3634cebbfdcb4bf44854f66da5e6d342110" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.631962 4772 scope.go:117] "RemoveContainer" containerID="4c6da56f01306accbad60e3ba02a91f4cc6ed8bb905bd9286671fd7f32153ed5" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.662147 4772 scope.go:117] "RemoveContainer" containerID="dce84557790ce392eba68b822eea435ede1d05fd9a392c9bd393123a9c7bf467" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.715545 4772 scope.go:117] "RemoveContainer" containerID="e995550ae720943eacfd405b30c920c20d450c9bc6c2389b27261b188859406e" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.735953 4772 scope.go:117] "RemoveContainer" containerID="e1b312b7631d415f567909a3003da4cdfd7208b6894d1397aa7da34098746b5a" Jan 27 15:35:04 crc kubenswrapper[4772]: I0127 15:35:04.760654 4772 scope.go:117] "RemoveContainer" containerID="35964dfe2e497930630aeb0996d17bf7bbe0e9d5e7bfb1d7efca05167ac578fc" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.438506 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:06 crc kubenswrapper[4772]: E0127 15:35:06.439357 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="extract-content" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.439379 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="extract-content" Jan 27 15:35:06 
crc kubenswrapper[4772]: E0127 15:35:06.439414 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="registry-server" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.439426 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="registry-server" Jan 27 15:35:06 crc kubenswrapper[4772]: E0127 15:35:06.439453 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="extract-utilities" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.439466 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="extract-utilities" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.439732 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="72fd7bb8-3c20-4a0a-b0eb-94f4e9059078" containerName="registry-server" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.441469 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.442454 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.539318 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.539376 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.539407 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnbq\" (UniqueName: \"kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.640323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.640377 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.640404 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnbq\" (UniqueName: \"kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.640918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.640973 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.663253 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:35:06 crc kubenswrapper[4772]: E0127 15:35:06.663488 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.672035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnbq\" (UniqueName: \"kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq\") pod \"community-operators-mcqtt\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:06 crc kubenswrapper[4772]: I0127 15:35:06.783741 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:07 crc kubenswrapper[4772]: I0127 15:35:07.261264 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:07 crc kubenswrapper[4772]: I0127 15:35:07.978992 4772 generic.go:334] "Generic (PLEG): container finished" podID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerID="91ae6cb76970d4cc89de236593aee41408991154309c0dac8ac8245c02925dfb" exitCode=0 Jan 27 15:35:07 crc kubenswrapper[4772]: I0127 15:35:07.979047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerDied","Data":"91ae6cb76970d4cc89de236593aee41408991154309c0dac8ac8245c02925dfb"} Jan 27 15:35:07 crc kubenswrapper[4772]: I0127 15:35:07.979113 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerStarted","Data":"72909f838bde105e1566f87719019a2bb56b2533e276fbb1c841e04408b58a81"} Jan 27 15:35:07 crc kubenswrapper[4772]: I0127 15:35:07.981223 4772 provider.go:102] Refreshing cache for 
provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:35:08 crc kubenswrapper[4772]: I0127 15:35:08.996990 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerStarted","Data":"4d6928d9c333649d0fd45228aacd272e5472427e08d477b556c71fa0f31fd418"} Jan 27 15:35:10 crc kubenswrapper[4772]: I0127 15:35:10.005659 4772 generic.go:334] "Generic (PLEG): container finished" podID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerID="4d6928d9c333649d0fd45228aacd272e5472427e08d477b556c71fa0f31fd418" exitCode=0 Jan 27 15:35:10 crc kubenswrapper[4772]: I0127 15:35:10.005709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerDied","Data":"4d6928d9c333649d0fd45228aacd272e5472427e08d477b556c71fa0f31fd418"} Jan 27 15:35:11 crc kubenswrapper[4772]: I0127 15:35:11.031534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerStarted","Data":"9768bde0eb8c6061a919ebcea3ee20bb86649ff6c879aaef71e49fd52c6736fe"} Jan 27 15:35:11 crc kubenswrapper[4772]: I0127 15:35:11.057282 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mcqtt" podStartSLOduration=2.443619788 podStartE2EDuration="5.057262192s" podCreationTimestamp="2026-01-27 15:35:06 +0000 UTC" firstStartedPulling="2026-01-27 15:35:07.980799129 +0000 UTC m=+1693.961408237" lastFinishedPulling="2026-01-27 15:35:10.594441533 +0000 UTC m=+1696.575050641" observedRunningTime="2026-01-27 15:35:11.051768214 +0000 UTC m=+1697.032377332" watchObservedRunningTime="2026-01-27 15:35:11.057262192 +0000 UTC m=+1697.037871290" Jan 27 15:35:16 crc kubenswrapper[4772]: I0127 15:35:16.784211 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:16 crc kubenswrapper[4772]: I0127 15:35:16.784854 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:16 crc kubenswrapper[4772]: I0127 15:35:16.855607 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:17 crc kubenswrapper[4772]: I0127 15:35:17.147770 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:17 crc kubenswrapper[4772]: I0127 15:35:17.207863 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:19 crc kubenswrapper[4772]: I0127 15:35:19.097963 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mcqtt" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="registry-server" containerID="cri-o://9768bde0eb8c6061a919ebcea3ee20bb86649ff6c879aaef71e49fd52c6736fe" gracePeriod=2 Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.112432 4772 generic.go:334] "Generic (PLEG): container finished" podID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerID="9768bde0eb8c6061a919ebcea3ee20bb86649ff6c879aaef71e49fd52c6736fe" exitCode=0 Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.112537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerDied","Data":"9768bde0eb8c6061a919ebcea3ee20bb86649ff6c879aaef71e49fd52c6736fe"} Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.663651 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 
15:35:20 crc kubenswrapper[4772]: E0127 15:35:20.664082 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.871407 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.973442 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content\") pod \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.973863 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities\") pod \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.974245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxnbq\" (UniqueName: \"kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq\") pod \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\" (UID: \"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a\") " Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.978277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities" (OuterVolumeSpecName: 
"utilities") pod "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" (UID: "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:35:20 crc kubenswrapper[4772]: I0127 15:35:20.998557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq" (OuterVolumeSpecName: "kube-api-access-dxnbq") pod "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" (UID: "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a"). InnerVolumeSpecName "kube-api-access-dxnbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.077140 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxnbq\" (UniqueName: \"kubernetes.io/projected/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-kube-api-access-dxnbq\") on node \"crc\" DevicePath \"\"" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.077201 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.127357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcqtt" event={"ID":"ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a","Type":"ContainerDied","Data":"72909f838bde105e1566f87719019a2bb56b2533e276fbb1c841e04408b58a81"} Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.128299 4772 scope.go:117] "RemoveContainer" containerID="9768bde0eb8c6061a919ebcea3ee20bb86649ff6c879aaef71e49fd52c6736fe" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.128530 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcqtt" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.150010 4772 scope.go:117] "RemoveContainer" containerID="4d6928d9c333649d0fd45228aacd272e5472427e08d477b556c71fa0f31fd418" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.168681 4772 scope.go:117] "RemoveContainer" containerID="91ae6cb76970d4cc89de236593aee41408991154309c0dac8ac8245c02925dfb" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.300030 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" (UID: "ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.380734 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.460249 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:21 crc kubenswrapper[4772]: I0127 15:35:21.467031 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mcqtt"] Jan 27 15:35:22 crc kubenswrapper[4772]: I0127 15:35:22.674791 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" path="/var/lib/kubelet/pods/ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a/volumes" Jan 27 15:35:31 crc kubenswrapper[4772]: I0127 15:35:31.663062 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:35:31 crc kubenswrapper[4772]: E0127 15:35:31.663904 4772 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:35:44 crc kubenswrapper[4772]: I0127 15:35:44.667734 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:35:44 crc kubenswrapper[4772]: E0127 15:35:44.668669 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:35:58 crc kubenswrapper[4772]: I0127 15:35:58.664505 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:35:58 crc kubenswrapper[4772]: E0127 15:35:58.665833 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:36:04 crc kubenswrapper[4772]: I0127 15:36:04.985567 4772 scope.go:117] "RemoveContainer" containerID="e1df482be0829e766abad6c9eb6842ba0e9d9f6fb517127a47f819ce8b296c7d" Jan 27 15:36:05 crc kubenswrapper[4772]: I0127 
15:36:05.053817 4772 scope.go:117] "RemoveContainer" containerID="dfac9725f6b1a542e430af600747ad2b7e4c5c445357868cef0cc0fe2f4dae49" Jan 27 15:36:05 crc kubenswrapper[4772]: I0127 15:36:05.075364 4772 scope.go:117] "RemoveContainer" containerID="bea9ecc5c8bd7f22996f379a16987a5468d25478afcbfdd986751cd73382ded7" Jan 27 15:36:09 crc kubenswrapper[4772]: I0127 15:36:09.663784 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:36:09 crc kubenswrapper[4772]: E0127 15:36:09.664349 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:36:24 crc kubenswrapper[4772]: I0127 15:36:24.668697 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:36:24 crc kubenswrapper[4772]: E0127 15:36:24.669523 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:36:36 crc kubenswrapper[4772]: I0127 15:36:36.663599 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:36:36 crc kubenswrapper[4772]: E0127 15:36:36.664896 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:36:51 crc kubenswrapper[4772]: I0127 15:36:51.662871 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:36:51 crc kubenswrapper[4772]: E0127 15:36:51.663663 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:37:05 crc kubenswrapper[4772]: I0127 15:37:05.214871 4772 scope.go:117] "RemoveContainer" containerID="7c4a200cbf0e299c55e6a175696503b17a34325960db8f2addd09db07bdebe00" Jan 27 15:37:06 crc kubenswrapper[4772]: I0127 15:37:06.663714 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:37:06 crc kubenswrapper[4772]: E0127 15:37:06.664312 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:37:19 crc kubenswrapper[4772]: I0127 15:37:19.664155 4772 scope.go:117] "RemoveContainer" 
containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:37:19 crc kubenswrapper[4772]: E0127 15:37:19.665773 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:37:31 crc kubenswrapper[4772]: I0127 15:37:31.662984 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:37:31 crc kubenswrapper[4772]: E0127 15:37:31.663771 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:37:44 crc kubenswrapper[4772]: I0127 15:37:44.669816 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:37:45 crc kubenswrapper[4772]: I0127 15:37:45.204604 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52"} Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.377734 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:28 crc kubenswrapper[4772]: E0127 15:39:28.378731 4772 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="extract-utilities" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.378747 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="extract-utilities" Jan 27 15:39:28 crc kubenswrapper[4772]: E0127 15:39:28.378761 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="registry-server" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.378771 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="registry-server" Jan 27 15:39:28 crc kubenswrapper[4772]: E0127 15:39:28.378783 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="extract-content" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.378791 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="extract-content" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.378969 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae1aaeb5-4ac9-42fc-8a59-a47ae8baf54a" containerName="registry-server" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.380257 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.403987 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.457783 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.457849 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.457871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsj5g\" (UniqueName: \"kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.559341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.559405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.559428 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsj5g\" (UniqueName: \"kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.560069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.560409 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.583821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsj5g\" (UniqueName: \"kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g\") pod \"redhat-operators-7f4lg\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:28 crc kubenswrapper[4772]: I0127 15:39:28.702231 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:29 crc kubenswrapper[4772]: I0127 15:39:29.150487 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:29 crc kubenswrapper[4772]: I0127 15:39:29.958888 4772 generic.go:334] "Generic (PLEG): container finished" podID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerID="096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7" exitCode=0 Jan 27 15:39:29 crc kubenswrapper[4772]: I0127 15:39:29.959037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerDied","Data":"096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7"} Jan 27 15:39:29 crc kubenswrapper[4772]: I0127 15:39:29.959512 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerStarted","Data":"55f822061f7b79771202dd86ea827bfce4a99cda83a00c362905aef55a31aa17"} Jan 27 15:39:30 crc kubenswrapper[4772]: I0127 15:39:30.969876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerStarted","Data":"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c"} Jan 27 15:39:31 crc kubenswrapper[4772]: I0127 15:39:31.978484 4772 generic.go:334] "Generic (PLEG): container finished" podID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerID="cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c" exitCode=0 Jan 27 15:39:31 crc kubenswrapper[4772]: I0127 15:39:31.978546 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" 
event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerDied","Data":"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c"} Jan 27 15:39:32 crc kubenswrapper[4772]: I0127 15:39:32.989432 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerStarted","Data":"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75"} Jan 27 15:39:33 crc kubenswrapper[4772]: I0127 15:39:33.009714 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7f4lg" podStartSLOduration=2.462876389 podStartE2EDuration="5.009694064s" podCreationTimestamp="2026-01-27 15:39:28 +0000 UTC" firstStartedPulling="2026-01-27 15:39:29.962101945 +0000 UTC m=+1955.942711053" lastFinishedPulling="2026-01-27 15:39:32.50891963 +0000 UTC m=+1958.489528728" observedRunningTime="2026-01-27 15:39:33.008686394 +0000 UTC m=+1958.989295492" watchObservedRunningTime="2026-01-27 15:39:33.009694064 +0000 UTC m=+1958.990303162" Jan 27 15:39:38 crc kubenswrapper[4772]: I0127 15:39:38.703550 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:38 crc kubenswrapper[4772]: I0127 15:39:38.704211 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:38 crc kubenswrapper[4772]: I0127 15:39:38.763075 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:39 crc kubenswrapper[4772]: I0127 15:39:39.077790 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:39 crc kubenswrapper[4772]: I0127 15:39:39.122522 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.047291 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7f4lg" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="registry-server" containerID="cri-o://098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75" gracePeriod=2 Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.474541 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.551826 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsj5g\" (UniqueName: \"kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g\") pod \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.551893 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities\") pod \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.551939 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content\") pod \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\" (UID: \"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041\") " Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.552970 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities" (OuterVolumeSpecName: "utilities") pod "f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" (UID: 
"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.557063 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g" (OuterVolumeSpecName: "kube-api-access-vsj5g") pod "f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" (UID: "f3e87a75-cf2d-4ee1-b89b-75cbce1a3041"). InnerVolumeSpecName "kube-api-access-vsj5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.653727 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsj5g\" (UniqueName: \"kubernetes.io/projected/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-kube-api-access-vsj5g\") on node \"crc\" DevicePath \"\"" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.653767 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.675835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" (UID: "f3e87a75-cf2d-4ee1-b89b-75cbce1a3041"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:39:41 crc kubenswrapper[4772]: I0127 15:39:41.760368 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.059444 4772 generic.go:334] "Generic (PLEG): container finished" podID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerID="098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75" exitCode=0 Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.059506 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerDied","Data":"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75"} Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.059544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7f4lg" event={"ID":"f3e87a75-cf2d-4ee1-b89b-75cbce1a3041","Type":"ContainerDied","Data":"55f822061f7b79771202dd86ea827bfce4a99cda83a00c362905aef55a31aa17"} Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.059570 4772 scope.go:117] "RemoveContainer" containerID="098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.059613 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7f4lg" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.086840 4772 scope.go:117] "RemoveContainer" containerID="cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.114296 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.120580 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7f4lg"] Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.138051 4772 scope.go:117] "RemoveContainer" containerID="096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.162100 4772 scope.go:117] "RemoveContainer" containerID="098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75" Jan 27 15:39:42 crc kubenswrapper[4772]: E0127 15:39:42.162839 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75\": container with ID starting with 098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75 not found: ID does not exist" containerID="098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.162914 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75"} err="failed to get container status \"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75\": rpc error: code = NotFound desc = could not find container \"098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75\": container with ID starting with 098817128bd0191e6d2a586cabdbaf5ba95e52947cc3cb768cf31df336960f75 not found: ID does 
not exist" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.162973 4772 scope.go:117] "RemoveContainer" containerID="cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c" Jan 27 15:39:42 crc kubenswrapper[4772]: E0127 15:39:42.163618 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c\": container with ID starting with cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c not found: ID does not exist" containerID="cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.163659 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c"} err="failed to get container status \"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c\": rpc error: code = NotFound desc = could not find container \"cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c\": container with ID starting with cef5d3c9dde46ed19fedc1688b6a3a0442d90768aa13b5b1c866559b791cff2c not found: ID does not exist" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.163690 4772 scope.go:117] "RemoveContainer" containerID="096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7" Jan 27 15:39:42 crc kubenswrapper[4772]: E0127 15:39:42.164270 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7\": container with ID starting with 096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7 not found: ID does not exist" containerID="096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.164342 4772 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7"} err="failed to get container status \"096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7\": rpc error: code = NotFound desc = could not find container \"096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7\": container with ID starting with 096fb8c1b8d3fbe84b8c8656f135ae1fef6771b626847166ee451c948bc66ce7 not found: ID does not exist" Jan 27 15:39:42 crc kubenswrapper[4772]: I0127 15:39:42.672317 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" path="/var/lib/kubelet/pods/f3e87a75-cf2d-4ee1-b89b-75cbce1a3041/volumes" Jan 27 15:40:12 crc kubenswrapper[4772]: I0127 15:40:12.058288 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:40:12 crc kubenswrapper[4772]: I0127 15:40:12.059039 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:40:42 crc kubenswrapper[4772]: I0127 15:40:42.058864 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:40:42 crc kubenswrapper[4772]: I0127 15:40:42.059801 4772 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.058249 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.059235 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.059319 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.060468 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.060553 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" 
containerID="cri-o://7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52" gracePeriod=600 Jan 27 15:41:12 crc kubenswrapper[4772]: E0127 15:41:12.181074 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67794a44_d793_4fd7_9e54_e40437f67c0b.slice/crio-7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67794a44_d793_4fd7_9e54_e40437f67c0b.slice/crio-conmon-7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52.scope\": RecentStats: unable to find data in memory cache]" Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.788022 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52"} Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.787953 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52" exitCode=0 Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.788134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d"} Jan 27 15:41:12 crc kubenswrapper[4772]: I0127 15:41:12.788109 4772 scope.go:117] "RemoveContainer" containerID="b0ae39c80720edbba923270ddb9a5ec4d4548e971f6133e3594454030be573c2" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.467919 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:17 crc kubenswrapper[4772]: E0127 15:41:17.468970 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="registry-server" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.468989 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="registry-server" Jan 27 15:41:17 crc kubenswrapper[4772]: E0127 15:41:17.469006 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="extract-utilities" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.469014 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="extract-utilities" Jan 27 15:41:17 crc kubenswrapper[4772]: E0127 15:41:17.469033 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="extract-content" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.469043 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="extract-content" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.469215 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3e87a75-cf2d-4ee1-b89b-75cbce1a3041" containerName="registry-server" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.470525 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.482536 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.586646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.586709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.586815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95fz9\" (UniqueName: \"kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.688042 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.688126 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.688158 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95fz9\" (UniqueName: \"kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.688769 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.688769 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.711397 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95fz9\" (UniqueName: \"kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9\") pod \"redhat-marketplace-dkq6v\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:17 crc kubenswrapper[4772]: I0127 15:41:17.790386 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:18 crc kubenswrapper[4772]: I0127 15:41:18.239076 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:18 crc kubenswrapper[4772]: I0127 15:41:18.839927 4772 generic.go:334] "Generic (PLEG): container finished" podID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerID="2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194" exitCode=0 Jan 27 15:41:18 crc kubenswrapper[4772]: I0127 15:41:18.840008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerDied","Data":"2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194"} Jan 27 15:41:18 crc kubenswrapper[4772]: I0127 15:41:18.840241 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerStarted","Data":"c1a4e1a540a2765632567eb226c9e18c795fcbbcddf9a726b1cc0c6e091e2a13"} Jan 27 15:41:18 crc kubenswrapper[4772]: I0127 15:41:18.843828 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:41:20 crc kubenswrapper[4772]: I0127 15:41:20.875038 4772 generic.go:334] "Generic (PLEG): container finished" podID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerID="ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f" exitCode=0 Jan 27 15:41:20 crc kubenswrapper[4772]: I0127 15:41:20.875391 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerDied","Data":"ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f"} Jan 27 15:41:21 crc kubenswrapper[4772]: I0127 15:41:21.889377 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerStarted","Data":"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5"} Jan 27 15:41:21 crc kubenswrapper[4772]: I0127 15:41:21.912258 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dkq6v" podStartSLOduration=2.132035986 podStartE2EDuration="4.91223516s" podCreationTimestamp="2026-01-27 15:41:17 +0000 UTC" firstStartedPulling="2026-01-27 15:41:18.843567656 +0000 UTC m=+2064.824176754" lastFinishedPulling="2026-01-27 15:41:21.62376683 +0000 UTC m=+2067.604375928" observedRunningTime="2026-01-27 15:41:21.908134252 +0000 UTC m=+2067.888743350" watchObservedRunningTime="2026-01-27 15:41:21.91223516 +0000 UTC m=+2067.892844268" Jan 27 15:41:27 crc kubenswrapper[4772]: I0127 15:41:27.790771 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:27 crc kubenswrapper[4772]: I0127 15:41:27.791381 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:27 crc kubenswrapper[4772]: I0127 15:41:27.833023 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:27 crc kubenswrapper[4772]: I0127 15:41:27.968968 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:28 crc kubenswrapper[4772]: I0127 15:41:28.066303 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:29 crc kubenswrapper[4772]: I0127 15:41:29.943295 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dkq6v" 
podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="registry-server" containerID="cri-o://1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5" gracePeriod=2 Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.437063 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.576626 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content\") pod \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.576798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95fz9\" (UniqueName: \"kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9\") pod \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.576972 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities\") pod \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\" (UID: \"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8\") " Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.578455 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities" (OuterVolumeSpecName: "utilities") pod "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" (UID: "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.588650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9" (OuterVolumeSpecName: "kube-api-access-95fz9") pod "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" (UID: "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8"). InnerVolumeSpecName "kube-api-access-95fz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.604078 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" (UID: "fb350c6d-0ecc-4fa7-8270-36a7589aa9a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.678946 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.678993 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.679006 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95fz9\" (UniqueName: \"kubernetes.io/projected/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8-kube-api-access-95fz9\") on node \"crc\" DevicePath \"\"" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.952968 4772 generic.go:334] "Generic (PLEG): container finished" podID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" 
containerID="1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5" exitCode=0 Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.953015 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerDied","Data":"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5"} Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.953048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkq6v" event={"ID":"fb350c6d-0ecc-4fa7-8270-36a7589aa9a8","Type":"ContainerDied","Data":"c1a4e1a540a2765632567eb226c9e18c795fcbbcddf9a726b1cc0c6e091e2a13"} Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.953066 4772 scope.go:117] "RemoveContainer" containerID="1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.953081 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkq6v" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.977839 4772 scope.go:117] "RemoveContainer" containerID="ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f" Jan 27 15:41:30 crc kubenswrapper[4772]: I0127 15:41:30.993410 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.001197 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkq6v"] Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.017483 4772 scope.go:117] "RemoveContainer" containerID="2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.039903 4772 scope.go:117] "RemoveContainer" containerID="1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5" Jan 27 15:41:31 crc kubenswrapper[4772]: E0127 15:41:31.040423 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5\": container with ID starting with 1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5 not found: ID does not exist" containerID="1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.040471 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5"} err="failed to get container status \"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5\": rpc error: code = NotFound desc = could not find container \"1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5\": container with ID starting with 1109392b5c2e2da59eacb68ca1039f67700720c68032965782bdee89b041c0f5 not found: 
ID does not exist" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.040502 4772 scope.go:117] "RemoveContainer" containerID="ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f" Jan 27 15:41:31 crc kubenswrapper[4772]: E0127 15:41:31.040914 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f\": container with ID starting with ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f not found: ID does not exist" containerID="ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.040946 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f"} err="failed to get container status \"ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f\": rpc error: code = NotFound desc = could not find container \"ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f\": container with ID starting with ae87e00dd314123fd500107d9147be5ee94d0ef022bbaadd406d05a97d92ce2f not found: ID does not exist" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.040970 4772 scope.go:117] "RemoveContainer" containerID="2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194" Jan 27 15:41:31 crc kubenswrapper[4772]: E0127 15:41:31.041274 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194\": container with ID starting with 2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194 not found: ID does not exist" containerID="2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194" Jan 27 15:41:31 crc kubenswrapper[4772]: I0127 15:41:31.041298 4772 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194"} err="failed to get container status \"2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194\": rpc error: code = NotFound desc = could not find container \"2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194\": container with ID starting with 2734d7c6099c2b586983d9f79bc0e7eba2fc0c635ae80160a19b4cfe53a27194 not found: ID does not exist" Jan 27 15:41:32 crc kubenswrapper[4772]: I0127 15:41:32.688657 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" path="/var/lib/kubelet/pods/fb350c6d-0ecc-4fa7-8270-36a7589aa9a8/volumes" Jan 27 15:43:12 crc kubenswrapper[4772]: I0127 15:43:12.058350 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:43:12 crc kubenswrapper[4772]: I0127 15:43:12.059130 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:43:42 crc kubenswrapper[4772]: I0127 15:43:42.058276 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:43:42 crc kubenswrapper[4772]: I0127 15:43:42.058883 4772 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:44:12 crc kubenswrapper[4772]: I0127 15:44:12.058494 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:44:12 crc kubenswrapper[4772]: I0127 15:44:12.059088 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:44:12 crc kubenswrapper[4772]: I0127 15:44:12.059142 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:44:12 crc kubenswrapper[4772]: I0127 15:44:12.059836 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:44:12 crc kubenswrapper[4772]: I0127 15:44:12.059896 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" 
containerID="cri-o://2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" gracePeriod=600 Jan 27 15:44:12 crc kubenswrapper[4772]: E0127 15:44:12.199948 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:44:13 crc kubenswrapper[4772]: I0127 15:44:13.143935 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" exitCode=0 Jan 27 15:44:13 crc kubenswrapper[4772]: I0127 15:44:13.143983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d"} Jan 27 15:44:13 crc kubenswrapper[4772]: I0127 15:44:13.144018 4772 scope.go:117] "RemoveContainer" containerID="7b761d6bc884bc3f2c2a56d23c68b1cd3740e77326d275aa98b5b71fcaad6f52" Jan 27 15:44:13 crc kubenswrapper[4772]: I0127 15:44:13.144613 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:44:13 crc kubenswrapper[4772]: E0127 15:44:13.144837 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.139291 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:20 crc kubenswrapper[4772]: E0127 15:44:20.140075 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="extract-utilities" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.140087 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="extract-utilities" Jan 27 15:44:20 crc kubenswrapper[4772]: E0127 15:44:20.140104 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="registry-server" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.140111 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="registry-server" Jan 27 15:44:20 crc kubenswrapper[4772]: E0127 15:44:20.140125 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="extract-content" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.140131 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="extract-content" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.140315 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb350c6d-0ecc-4fa7-8270-36a7589aa9a8" containerName="registry-server" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.141309 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.148699 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.209530 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.209628 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.209800 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrknc\" (UniqueName: \"kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.310746 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.310876 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hrknc\" (UniqueName: \"kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.310899 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.311412 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.312348 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.329502 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrknc\" (UniqueName: \"kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc\") pod \"certified-operators-qsdqm\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.467293 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:20 crc kubenswrapper[4772]: I0127 15:44:20.728460 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:21 crc kubenswrapper[4772]: I0127 15:44:21.207136 4772 generic.go:334] "Generic (PLEG): container finished" podID="4f233b51-cb2c-420c-8041-51f37d626af8" containerID="efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725" exitCode=0 Jan 27 15:44:21 crc kubenswrapper[4772]: I0127 15:44:21.207200 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerDied","Data":"efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725"} Jan 27 15:44:21 crc kubenswrapper[4772]: I0127 15:44:21.207248 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerStarted","Data":"cf55e1a8d42200dea1c6edfd0f692b2f2dd7a9c0337a4be623c23a3c32885937"} Jan 27 15:44:22 crc kubenswrapper[4772]: I0127 15:44:22.214997 4772 generic.go:334] "Generic (PLEG): container finished" podID="4f233b51-cb2c-420c-8041-51f37d626af8" containerID="3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b" exitCode=0 Jan 27 15:44:22 crc kubenswrapper[4772]: I0127 15:44:22.215041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerDied","Data":"3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b"} Jan 27 15:44:23 crc kubenswrapper[4772]: I0127 15:44:23.224145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" 
event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerStarted","Data":"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da"} Jan 27 15:44:23 crc kubenswrapper[4772]: I0127 15:44:23.241569 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qsdqm" podStartSLOduration=1.748532371 podStartE2EDuration="3.241547665s" podCreationTimestamp="2026-01-27 15:44:20 +0000 UTC" firstStartedPulling="2026-01-27 15:44:21.208837046 +0000 UTC m=+2247.189446144" lastFinishedPulling="2026-01-27 15:44:22.70185234 +0000 UTC m=+2248.682461438" observedRunningTime="2026-01-27 15:44:23.239000083 +0000 UTC m=+2249.219609191" watchObservedRunningTime="2026-01-27 15:44:23.241547665 +0000 UTC m=+2249.222156764" Jan 27 15:44:26 crc kubenswrapper[4772]: I0127 15:44:26.663377 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:44:26 crc kubenswrapper[4772]: E0127 15:44:26.664099 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:44:30 crc kubenswrapper[4772]: I0127 15:44:30.468468 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:30 crc kubenswrapper[4772]: I0127 15:44:30.468526 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:30 crc kubenswrapper[4772]: I0127 15:44:30.507150 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:31 crc kubenswrapper[4772]: I0127 15:44:31.320488 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:31 crc kubenswrapper[4772]: I0127 15:44:31.371235 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.294016 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qsdqm" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="registry-server" containerID="cri-o://4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da" gracePeriod=2 Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.666848 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.800201 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities\") pod \"4f233b51-cb2c-420c-8041-51f37d626af8\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.800240 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content\") pod \"4f233b51-cb2c-420c-8041-51f37d626af8\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.800333 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrknc\" (UniqueName: \"kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc\") pod 
\"4f233b51-cb2c-420c-8041-51f37d626af8\" (UID: \"4f233b51-cb2c-420c-8041-51f37d626af8\") " Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.801538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities" (OuterVolumeSpecName: "utilities") pod "4f233b51-cb2c-420c-8041-51f37d626af8" (UID: "4f233b51-cb2c-420c-8041-51f37d626af8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.802308 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.805393 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc" (OuterVolumeSpecName: "kube-api-access-hrknc") pod "4f233b51-cb2c-420c-8041-51f37d626af8" (UID: "4f233b51-cb2c-420c-8041-51f37d626af8"). InnerVolumeSpecName "kube-api-access-hrknc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.849930 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f233b51-cb2c-420c-8041-51f37d626af8" (UID: "4f233b51-cb2c-420c-8041-51f37d626af8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.903991 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f233b51-cb2c-420c-8041-51f37d626af8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:44:33 crc kubenswrapper[4772]: I0127 15:44:33.904023 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrknc\" (UniqueName: \"kubernetes.io/projected/4f233b51-cb2c-420c-8041-51f37d626af8-kube-api-access-hrknc\") on node \"crc\" DevicePath \"\"" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.304829 4772 generic.go:334] "Generic (PLEG): container finished" podID="4f233b51-cb2c-420c-8041-51f37d626af8" containerID="4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da" exitCode=0 Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.304875 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerDied","Data":"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da"} Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.304903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qsdqm" event={"ID":"4f233b51-cb2c-420c-8041-51f37d626af8","Type":"ContainerDied","Data":"cf55e1a8d42200dea1c6edfd0f692b2f2dd7a9c0337a4be623c23a3c32885937"} Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.304922 4772 scope.go:117] "RemoveContainer" containerID="4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.305049 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qsdqm" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.334011 4772 scope.go:117] "RemoveContainer" containerID="3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.347012 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.358496 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qsdqm"] Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.374516 4772 scope.go:117] "RemoveContainer" containerID="efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.392606 4772 scope.go:117] "RemoveContainer" containerID="4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da" Jan 27 15:44:34 crc kubenswrapper[4772]: E0127 15:44:34.393710 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da\": container with ID starting with 4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da not found: ID does not exist" containerID="4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.393747 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da"} err="failed to get container status \"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da\": rpc error: code = NotFound desc = could not find container \"4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da\": container with ID starting with 4ce816182821da35d33be57a085da39c9beb9bdfd04fd5d55ee933fbdbd497da not 
found: ID does not exist" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.393770 4772 scope.go:117] "RemoveContainer" containerID="3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b" Jan 27 15:44:34 crc kubenswrapper[4772]: E0127 15:44:34.394156 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b\": container with ID starting with 3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b not found: ID does not exist" containerID="3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.394189 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b"} err="failed to get container status \"3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b\": rpc error: code = NotFound desc = could not find container \"3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b\": container with ID starting with 3b3133f4d862cd6fbf455830ef984b383c19cfdc69480ebb3bf4d964e5a3bf9b not found: ID does not exist" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.394201 4772 scope.go:117] "RemoveContainer" containerID="efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725" Jan 27 15:44:34 crc kubenswrapper[4772]: E0127 15:44:34.394453 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725\": container with ID starting with efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725 not found: ID does not exist" containerID="efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.394472 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725"} err="failed to get container status \"efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725\": rpc error: code = NotFound desc = could not find container \"efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725\": container with ID starting with efa341e571e74e65be29aaef4632650e18ccae6a662231c4759328298a83b725 not found: ID does not exist" Jan 27 15:44:34 crc kubenswrapper[4772]: I0127 15:44:34.679763 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" path="/var/lib/kubelet/pods/4f233b51-cb2c-420c-8041-51f37d626af8/volumes" Jan 27 15:44:37 crc kubenswrapper[4772]: I0127 15:44:37.662866 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:44:37 crc kubenswrapper[4772]: E0127 15:44:37.663110 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:44:50 crc kubenswrapper[4772]: I0127 15:44:50.663449 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:44:50 crc kubenswrapper[4772]: E0127 15:44:50.664221 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.147728 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd"] Jan 27 15:45:00 crc kubenswrapper[4772]: E0127 15:45:00.148930 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="extract-content" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.148948 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="extract-content" Jan 27 15:45:00 crc kubenswrapper[4772]: E0127 15:45:00.148961 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="registry-server" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.148972 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="registry-server" Jan 27 15:45:00 crc kubenswrapper[4772]: E0127 15:45:00.148986 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="extract-utilities" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.148993 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="extract-utilities" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.149159 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f233b51-cb2c-420c-8041-51f37d626af8" containerName="registry-server" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.149779 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.152873 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.152920 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.178779 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd"] Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.292485 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d288b\" (UniqueName: \"kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.292812 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.292929 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.394305 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d288b\" (UniqueName: \"kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.394634 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.394776 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.395570 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.407295 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.414589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d288b\" (UniqueName: \"kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b\") pod \"collect-profiles-29492145-st6dd\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.522842 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:00 crc kubenswrapper[4772]: I0127 15:45:00.760664 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd"] Jan 27 15:45:01 crc kubenswrapper[4772]: I0127 15:45:01.484823 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" event={"ID":"df82c0c4-9652-407e-b63d-17e2ccdb38aa","Type":"ContainerStarted","Data":"72ea0a33955c0509b888997e5b6ca0dc68de786a608fe5aae9035bbbf84ae773"} Jan 27 15:45:01 crc kubenswrapper[4772]: I0127 15:45:01.485317 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" event={"ID":"df82c0c4-9652-407e-b63d-17e2ccdb38aa","Type":"ContainerStarted","Data":"6e8b24753000f1a5886c84ae46b17b89b02042bde799c38ca5eda8c6b7c07dde"} Jan 27 15:45:01 crc kubenswrapper[4772]: I0127 15:45:01.507435 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" 
podStartSLOduration=1.507416612 podStartE2EDuration="1.507416612s" podCreationTimestamp="2026-01-27 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:45:01.50204835 +0000 UTC m=+2287.482657448" watchObservedRunningTime="2026-01-27 15:45:01.507416612 +0000 UTC m=+2287.488025710" Jan 27 15:45:02 crc kubenswrapper[4772]: I0127 15:45:02.493759 4772 generic.go:334] "Generic (PLEG): container finished" podID="df82c0c4-9652-407e-b63d-17e2ccdb38aa" containerID="72ea0a33955c0509b888997e5b6ca0dc68de786a608fe5aae9035bbbf84ae773" exitCode=0 Jan 27 15:45:02 crc kubenswrapper[4772]: I0127 15:45:02.493810 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" event={"ID":"df82c0c4-9652-407e-b63d-17e2ccdb38aa","Type":"ContainerDied","Data":"72ea0a33955c0509b888997e5b6ca0dc68de786a608fe5aae9035bbbf84ae773"} Jan 27 15:45:02 crc kubenswrapper[4772]: I0127 15:45:02.663209 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:45:02 crc kubenswrapper[4772]: E0127 15:45:02.663496 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.767238 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.846476 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume\") pod \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.846661 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume\") pod \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.846755 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d288b\" (UniqueName: \"kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b\") pod \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\" (UID: \"df82c0c4-9652-407e-b63d-17e2ccdb38aa\") " Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.847943 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume" (OuterVolumeSpecName: "config-volume") pod "df82c0c4-9652-407e-b63d-17e2ccdb38aa" (UID: "df82c0c4-9652-407e-b63d-17e2ccdb38aa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.853479 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df82c0c4-9652-407e-b63d-17e2ccdb38aa" (UID: "df82c0c4-9652-407e-b63d-17e2ccdb38aa"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.853901 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b" (OuterVolumeSpecName: "kube-api-access-d288b") pod "df82c0c4-9652-407e-b63d-17e2ccdb38aa" (UID: "df82c0c4-9652-407e-b63d-17e2ccdb38aa"). InnerVolumeSpecName "kube-api-access-d288b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.948815 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df82c0c4-9652-407e-b63d-17e2ccdb38aa-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.948863 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df82c0c4-9652-407e-b63d-17e2ccdb38aa-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:45:03 crc kubenswrapper[4772]: I0127 15:45:03.948878 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d288b\" (UniqueName: \"kubernetes.io/projected/df82c0c4-9652-407e-b63d-17e2ccdb38aa-kube-api-access-d288b\") on node \"crc\" DevicePath \"\"" Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.509016 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" event={"ID":"df82c0c4-9652-407e-b63d-17e2ccdb38aa","Type":"ContainerDied","Data":"6e8b24753000f1a5886c84ae46b17b89b02042bde799c38ca5eda8c6b7c07dde"} Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.509075 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e8b24753000f1a5886c84ae46b17b89b02042bde799c38ca5eda8c6b7c07dde" Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.509096 4772 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd" Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.583094 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6"] Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.589656 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492100-r2zj6"] Jan 27 15:45:04 crc kubenswrapper[4772]: I0127 15:45:04.672960 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6b54ae2-d365-4988-8e69-704574c7962a" path="/var/lib/kubelet/pods/c6b54ae2-d365-4988-8e69-704574c7962a/volumes" Jan 27 15:45:05 crc kubenswrapper[4772]: I0127 15:45:05.385539 4772 scope.go:117] "RemoveContainer" containerID="4f5ed02624877f82608d4a7a5fead892a80497d0b63bf729eaa6c0d56cf6aac6" Jan 27 15:45:16 crc kubenswrapper[4772]: I0127 15:45:16.663197 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:45:16 crc kubenswrapper[4772]: E0127 15:45:16.664056 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:45:28 crc kubenswrapper[4772]: I0127 15:45:28.664093 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:45:28 crc kubenswrapper[4772]: E0127 15:45:28.665785 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:45:41 crc kubenswrapper[4772]: I0127 15:45:41.663628 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:45:41 crc kubenswrapper[4772]: E0127 15:45:41.665618 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:45:53 crc kubenswrapper[4772]: I0127 15:45:53.663444 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:45:53 crc kubenswrapper[4772]: E0127 15:45:53.664104 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:46:04 crc kubenswrapper[4772]: I0127 15:46:04.667523 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:46:04 crc kubenswrapper[4772]: E0127 15:46:04.668285 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:46:16 crc kubenswrapper[4772]: I0127 15:46:16.664193 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:46:16 crc kubenswrapper[4772]: E0127 15:46:16.665283 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:46:29 crc kubenswrapper[4772]: I0127 15:46:29.664705 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:46:29 crc kubenswrapper[4772]: E0127 15:46:29.666004 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.772479 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:31 crc kubenswrapper[4772]: E0127 15:46:31.773215 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df82c0c4-9652-407e-b63d-17e2ccdb38aa" 
containerName="collect-profiles" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.773232 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="df82c0c4-9652-407e-b63d-17e2ccdb38aa" containerName="collect-profiles" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.773397 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="df82c0c4-9652-407e-b63d-17e2ccdb38aa" containerName="collect-profiles" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.774727 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.786248 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.824126 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.824233 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.824269 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92bc9\" (UniqueName: \"kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " 
pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.925254 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.925312 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92bc9\" (UniqueName: \"kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.925345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.925757 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.925799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " 
pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:31 crc kubenswrapper[4772]: I0127 15:46:31.946266 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92bc9\" (UniqueName: \"kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9\") pod \"community-operators-n5mdv\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:32 crc kubenswrapper[4772]: I0127 15:46:32.139690 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:32 crc kubenswrapper[4772]: I0127 15:46:32.626626 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:33 crc kubenswrapper[4772]: I0127 15:46:33.149333 4772 generic.go:334] "Generic (PLEG): container finished" podID="ced76afc-fd78-450c-b74e-dec420ed75db" containerID="8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa" exitCode=0 Jan 27 15:46:33 crc kubenswrapper[4772]: I0127 15:46:33.149474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerDied","Data":"8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa"} Jan 27 15:46:33 crc kubenswrapper[4772]: I0127 15:46:33.149689 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerStarted","Data":"b82ef5fd5075c8859afe639fb3cfeb3d073b576a10725fee36633021a1033fa0"} Jan 27 15:46:33 crc kubenswrapper[4772]: I0127 15:46:33.152222 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:46:35 crc kubenswrapper[4772]: I0127 15:46:35.165815 4772 generic.go:334] "Generic (PLEG): container 
finished" podID="ced76afc-fd78-450c-b74e-dec420ed75db" containerID="c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9" exitCode=0 Jan 27 15:46:35 crc kubenswrapper[4772]: I0127 15:46:35.166016 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerDied","Data":"c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9"} Jan 27 15:46:36 crc kubenswrapper[4772]: I0127 15:46:36.176814 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerStarted","Data":"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0"} Jan 27 15:46:36 crc kubenswrapper[4772]: I0127 15:46:36.194275 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n5mdv" podStartSLOduration=2.741323146 podStartE2EDuration="5.194255754s" podCreationTimestamp="2026-01-27 15:46:31 +0000 UTC" firstStartedPulling="2026-01-27 15:46:33.151908311 +0000 UTC m=+2379.132517409" lastFinishedPulling="2026-01-27 15:46:35.604840919 +0000 UTC m=+2381.585450017" observedRunningTime="2026-01-27 15:46:36.190824667 +0000 UTC m=+2382.171433765" watchObservedRunningTime="2026-01-27 15:46:36.194255754 +0000 UTC m=+2382.174864852" Jan 27 15:46:40 crc kubenswrapper[4772]: I0127 15:46:40.663374 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:46:40 crc kubenswrapper[4772]: E0127 15:46:40.664101 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:46:42 crc kubenswrapper[4772]: I0127 15:46:42.140785 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:42 crc kubenswrapper[4772]: I0127 15:46:42.141103 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:42 crc kubenswrapper[4772]: I0127 15:46:42.198857 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:42 crc kubenswrapper[4772]: I0127 15:46:42.272105 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:42 crc kubenswrapper[4772]: I0127 15:46:42.439832 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.237387 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n5mdv" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="registry-server" containerID="cri-o://c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0" gracePeriod=2 Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.587963 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.617028 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities\") pod \"ced76afc-fd78-450c-b74e-dec420ed75db\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.617080 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92bc9\" (UniqueName: \"kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9\") pod \"ced76afc-fd78-450c-b74e-dec420ed75db\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.617150 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content\") pod \"ced76afc-fd78-450c-b74e-dec420ed75db\" (UID: \"ced76afc-fd78-450c-b74e-dec420ed75db\") " Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.619130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities" (OuterVolumeSpecName: "utilities") pod "ced76afc-fd78-450c-b74e-dec420ed75db" (UID: "ced76afc-fd78-450c-b74e-dec420ed75db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.627037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9" (OuterVolumeSpecName: "kube-api-access-92bc9") pod "ced76afc-fd78-450c-b74e-dec420ed75db" (UID: "ced76afc-fd78-450c-b74e-dec420ed75db"). InnerVolumeSpecName "kube-api-access-92bc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.682596 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ced76afc-fd78-450c-b74e-dec420ed75db" (UID: "ced76afc-fd78-450c-b74e-dec420ed75db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.718575 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92bc9\" (UniqueName: \"kubernetes.io/projected/ced76afc-fd78-450c-b74e-dec420ed75db-kube-api-access-92bc9\") on node \"crc\" DevicePath \"\"" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.718604 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:46:44 crc kubenswrapper[4772]: I0127 15:46:44.718613 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ced76afc-fd78-450c-b74e-dec420ed75db-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.250368 4772 generic.go:334] "Generic (PLEG): container finished" podID="ced76afc-fd78-450c-b74e-dec420ed75db" containerID="c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0" exitCode=0 Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.250494 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerDied","Data":"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0"} Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.252038 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-n5mdv" event={"ID":"ced76afc-fd78-450c-b74e-dec420ed75db","Type":"ContainerDied","Data":"b82ef5fd5075c8859afe639fb3cfeb3d073b576a10725fee36633021a1033fa0"} Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.250653 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5mdv" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.252078 4772 scope.go:117] "RemoveContainer" containerID="c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.276526 4772 scope.go:117] "RemoveContainer" containerID="c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.312398 4772 scope.go:117] "RemoveContainer" containerID="8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.362485 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.371384 4772 scope.go:117] "RemoveContainer" containerID="c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.372229 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n5mdv"] Jan 27 15:46:45 crc kubenswrapper[4772]: E0127 15:46:45.372510 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0\": container with ID starting with c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0 not found: ID does not exist" containerID="c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 
15:46:45.372573 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0"} err="failed to get container status \"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0\": rpc error: code = NotFound desc = could not find container \"c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0\": container with ID starting with c3b0df09bdf9b7355b490a4af154b0d2e84dbe10db8179c311fd237940d8cbc0 not found: ID does not exist" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.372606 4772 scope.go:117] "RemoveContainer" containerID="c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9" Jan 27 15:46:45 crc kubenswrapper[4772]: E0127 15:46:45.373108 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9\": container with ID starting with c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9 not found: ID does not exist" containerID="c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.373143 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9"} err="failed to get container status \"c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9\": rpc error: code = NotFound desc = could not find container \"c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9\": container with ID starting with c4efac9ce3e6005b64b6af3bdc85f43a1a209171b2708c0685747465c7568cf9 not found: ID does not exist" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.373187 4772 scope.go:117] "RemoveContainer" containerID="8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa" Jan 27 15:46:45 crc 
kubenswrapper[4772]: E0127 15:46:45.373511 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa\": container with ID starting with 8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa not found: ID does not exist" containerID="8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa" Jan 27 15:46:45 crc kubenswrapper[4772]: I0127 15:46:45.373545 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa"} err="failed to get container status \"8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa\": rpc error: code = NotFound desc = could not find container \"8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa\": container with ID starting with 8ac0fc5d06479293a7d120d508c0b491eefe60dc43603e556115fb79adac9ffa not found: ID does not exist" Jan 27 15:46:46 crc kubenswrapper[4772]: I0127 15:46:46.682444 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" path="/var/lib/kubelet/pods/ced76afc-fd78-450c-b74e-dec420ed75db/volumes" Jan 27 15:46:51 crc kubenswrapper[4772]: I0127 15:46:51.662970 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:46:51 crc kubenswrapper[4772]: E0127 15:46:51.663858 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:47:05 crc 
kubenswrapper[4772]: I0127 15:47:05.663271 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:47:05 crc kubenswrapper[4772]: E0127 15:47:05.663959 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:47:17 crc kubenswrapper[4772]: I0127 15:47:17.663519 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:47:17 crc kubenswrapper[4772]: E0127 15:47:17.664649 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:47:28 crc kubenswrapper[4772]: I0127 15:47:28.663603 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:47:28 crc kubenswrapper[4772]: E0127 15:47:28.664379 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 
27 15:47:41 crc kubenswrapper[4772]: I0127 15:47:41.662842 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:47:41 crc kubenswrapper[4772]: E0127 15:47:41.663697 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:47:53 crc kubenswrapper[4772]: I0127 15:47:53.663910 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:47:53 crc kubenswrapper[4772]: E0127 15:47:53.664814 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:48:05 crc kubenswrapper[4772]: I0127 15:48:05.662624 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:48:05 crc kubenswrapper[4772]: E0127 15:48:05.663324 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:48:17 crc kubenswrapper[4772]: I0127 15:48:17.663551 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:48:17 crc kubenswrapper[4772]: E0127 15:48:17.664323 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:48:31 crc kubenswrapper[4772]: I0127 15:48:31.663144 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:48:31 crc kubenswrapper[4772]: E0127 15:48:31.663872 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:48:44 crc kubenswrapper[4772]: I0127 15:48:44.668706 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:48:44 crc kubenswrapper[4772]: E0127 15:48:44.669619 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:48:57 crc kubenswrapper[4772]: I0127 15:48:57.663606 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:48:57 crc kubenswrapper[4772]: E0127 15:48:57.664464 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:49:12 crc kubenswrapper[4772]: I0127 15:49:12.663515 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:49:12 crc kubenswrapper[4772]: I0127 15:49:12.980397 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364"} Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.420092 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:19 crc kubenswrapper[4772]: E0127 15:50:19.421047 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="extract-utilities" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.421063 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="extract-utilities" Jan 27 15:50:19 crc kubenswrapper[4772]: E0127 15:50:19.421083 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="extract-content" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.421092 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="extract-content" Jan 27 15:50:19 crc kubenswrapper[4772]: E0127 15:50:19.421113 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="registry-server" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.421121 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="registry-server" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.421330 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced76afc-fd78-450c-b74e-dec420ed75db" containerName="registry-server" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.422478 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.443747 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.553187 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.553288 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-575hx\" (UniqueName: \"kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx\") pod \"redhat-operators-659gv\" (UID: 
\"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.553343 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.654771 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.654891 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575hx\" (UniqueName: \"kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.654949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.655554 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " 
pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.655581 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.678162 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575hx\" (UniqueName: \"kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx\") pod \"redhat-operators-659gv\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:19 crc kubenswrapper[4772]: I0127 15:50:19.745143 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:20 crc kubenswrapper[4772]: I0127 15:50:20.205386 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:20 crc kubenswrapper[4772]: I0127 15:50:20.459702 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerStarted","Data":"fce9617314a607b31399658547b245df6c440a62125f27475f93010305600c7d"} Jan 27 15:50:21 crc kubenswrapper[4772]: I0127 15:50:21.468693 4772 generic.go:334] "Generic (PLEG): container finished" podID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerID="79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9" exitCode=0 Jan 27 15:50:21 crc kubenswrapper[4772]: I0127 15:50:21.468740 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" 
event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerDied","Data":"79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9"} Jan 27 15:50:24 crc kubenswrapper[4772]: I0127 15:50:24.520772 4772 generic.go:334] "Generic (PLEG): container finished" podID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerID="585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4" exitCode=0 Jan 27 15:50:24 crc kubenswrapper[4772]: I0127 15:50:24.520860 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerDied","Data":"585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4"} Jan 27 15:50:26 crc kubenswrapper[4772]: I0127 15:50:26.538152 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerStarted","Data":"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b"} Jan 27 15:50:29 crc kubenswrapper[4772]: I0127 15:50:29.746200 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:29 crc kubenswrapper[4772]: I0127 15:50:29.746559 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:30 crc kubenswrapper[4772]: I0127 15:50:30.797628 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-659gv" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="registry-server" probeResult="failure" output=< Jan 27 15:50:30 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 15:50:30 crc kubenswrapper[4772]: > Jan 27 15:50:39 crc kubenswrapper[4772]: I0127 15:50:39.789153 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:39 crc kubenswrapper[4772]: I0127 15:50:39.814702 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-659gv" podStartSLOduration=16.034314127 podStartE2EDuration="20.814681841s" podCreationTimestamp="2026-01-27 15:50:19 +0000 UTC" firstStartedPulling="2026-01-27 15:50:21.472318495 +0000 UTC m=+2607.452927593" lastFinishedPulling="2026-01-27 15:50:26.252686209 +0000 UTC m=+2612.233295307" observedRunningTime="2026-01-27 15:50:26.56369759 +0000 UTC m=+2612.544306688" watchObservedRunningTime="2026-01-27 15:50:39.814681841 +0000 UTC m=+2625.795290939" Jan 27 15:50:39 crc kubenswrapper[4772]: I0127 15:50:39.832869 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:40 crc kubenswrapper[4772]: I0127 15:50:40.026881 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:41 crc kubenswrapper[4772]: I0127 15:50:41.654322 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-659gv" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="registry-server" containerID="cri-o://d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b" gracePeriod=2 Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.153656 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.278022 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575hx\" (UniqueName: \"kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx\") pod \"226b848a-07cc-44c2-abc3-c60d88c569ca\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.278402 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities\") pod \"226b848a-07cc-44c2-abc3-c60d88c569ca\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.278560 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content\") pod \"226b848a-07cc-44c2-abc3-c60d88c569ca\" (UID: \"226b848a-07cc-44c2-abc3-c60d88c569ca\") " Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.279266 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities" (OuterVolumeSpecName: "utilities") pod "226b848a-07cc-44c2-abc3-c60d88c569ca" (UID: "226b848a-07cc-44c2-abc3-c60d88c569ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.288981 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx" (OuterVolumeSpecName: "kube-api-access-575hx") pod "226b848a-07cc-44c2-abc3-c60d88c569ca" (UID: "226b848a-07cc-44c2-abc3-c60d88c569ca"). InnerVolumeSpecName "kube-api-access-575hx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.380402 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575hx\" (UniqueName: \"kubernetes.io/projected/226b848a-07cc-44c2-abc3-c60d88c569ca-kube-api-access-575hx\") on node \"crc\" DevicePath \"\"" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.380459 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.416528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "226b848a-07cc-44c2-abc3-c60d88c569ca" (UID: "226b848a-07cc-44c2-abc3-c60d88c569ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.481552 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/226b848a-07cc-44c2-abc3-c60d88c569ca-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.664015 4772 generic.go:334] "Generic (PLEG): container finished" podID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerID="d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b" exitCode=0 Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.664183 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-659gv" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.671298 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerDied","Data":"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b"} Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.671340 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-659gv" event={"ID":"226b848a-07cc-44c2-abc3-c60d88c569ca","Type":"ContainerDied","Data":"fce9617314a607b31399658547b245df6c440a62125f27475f93010305600c7d"} Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.671362 4772 scope.go:117] "RemoveContainer" containerID="d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.695625 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.698639 4772 scope.go:117] "RemoveContainer" containerID="585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.702407 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-659gv"] Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.715113 4772 scope.go:117] "RemoveContainer" containerID="79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.743386 4772 scope.go:117] "RemoveContainer" containerID="d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b" Jan 27 15:50:42 crc kubenswrapper[4772]: E0127 15:50:42.745568 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b\": container with ID starting with d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b not found: ID does not exist" containerID="d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.745641 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b"} err="failed to get container status \"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b\": rpc error: code = NotFound desc = could not find container \"d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b\": container with ID starting with d19c7c888f9b5a4f918c07659870716735aa11ca8638eaee98485a627be9df3b not found: ID does not exist" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.745686 4772 scope.go:117] "RemoveContainer" containerID="585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4" Jan 27 15:50:42 crc kubenswrapper[4772]: E0127 15:50:42.746139 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4\": container with ID starting with 585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4 not found: ID does not exist" containerID="585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.746200 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4"} err="failed to get container status \"585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4\": rpc error: code = NotFound desc = could not find container \"585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4\": container with ID 
starting with 585a2a943f1fd491f2e5864b367be29122b001bcdf8da8f664946a2b7086fbc4 not found: ID does not exist" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.746231 4772 scope.go:117] "RemoveContainer" containerID="79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9" Jan 27 15:50:42 crc kubenswrapper[4772]: E0127 15:50:42.746626 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9\": container with ID starting with 79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9 not found: ID does not exist" containerID="79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9" Jan 27 15:50:42 crc kubenswrapper[4772]: I0127 15:50:42.746666 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9"} err="failed to get container status \"79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9\": rpc error: code = NotFound desc = could not find container \"79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9\": container with ID starting with 79a50a5aee22472dfcc6cefd94361dac06f6e4d3df5bd5b2bc7875a667c02ee9 not found: ID does not exist" Jan 27 15:50:44 crc kubenswrapper[4772]: I0127 15:50:44.670475 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" path="/var/lib/kubelet/pods/226b848a-07cc-44c2-abc3-c60d88c569ca/volumes" Jan 27 15:51:12 crc kubenswrapper[4772]: I0127 15:51:12.058766 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:51:12 crc kubenswrapper[4772]: I0127 
15:51:12.059449 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:51:42 crc kubenswrapper[4772]: I0127 15:51:42.058661 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:51:42 crc kubenswrapper[4772]: I0127 15:51:42.059305 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.058991 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.059578 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.059631 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.060224 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.060276 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364" gracePeriod=600 Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.280036 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364" exitCode=0 Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.280098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364"} Jan 27 15:52:12 crc kubenswrapper[4772]: I0127 15:52:12.280681 4772 scope.go:117] "RemoveContainer" containerID="2d44d6ccc12cd5721067948851620a9f0611d13982269d5631689aef90c34d5d" Jan 27 15:52:13 crc kubenswrapper[4772]: I0127 15:52:13.289990 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51"} Jan 27 15:54:12 crc kubenswrapper[4772]: I0127 15:54:12.059119 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:54:12 crc kubenswrapper[4772]: I0127 15:54:12.059737 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:54:42 crc kubenswrapper[4772]: I0127 15:54:42.058350 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:54:42 crc kubenswrapper[4772]: I0127 15:54:42.059031 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.058749 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.060321 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.060425 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.061583 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.061678 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" gracePeriod=600 Jan 27 15:55:12 crc kubenswrapper[4772]: E0127 15:55:12.185053 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.695953 4772 
generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" exitCode=0 Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.696006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51"} Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.696050 4772 scope.go:117] "RemoveContainer" containerID="882ea0b40217e0829fd486d6dedd680ff982d35f68339f2340bf98c3fe9a0364" Jan 27 15:55:12 crc kubenswrapper[4772]: I0127 15:55:12.696762 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:55:12 crc kubenswrapper[4772]: E0127 15:55:12.697068 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:55:26 crc kubenswrapper[4772]: I0127 15:55:26.662872 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:55:26 crc kubenswrapper[4772]: E0127 15:55:26.663505 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:55:39 crc kubenswrapper[4772]: I0127 15:55:39.662539 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:55:39 crc kubenswrapper[4772]: E0127 15:55:39.663209 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:55:54 crc kubenswrapper[4772]: I0127 15:55:54.668622 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:55:54 crc kubenswrapper[4772]: E0127 15:55:54.669840 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:56:08 crc kubenswrapper[4772]: I0127 15:56:08.663472 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:56:08 crc kubenswrapper[4772]: E0127 15:56:08.664347 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:56:20 crc kubenswrapper[4772]: I0127 15:56:20.663332 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:56:20 crc kubenswrapper[4772]: E0127 15:56:20.664201 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:56:35 crc kubenswrapper[4772]: I0127 15:56:35.662392 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:56:35 crc kubenswrapper[4772]: E0127 15:56:35.663078 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:56:48 crc kubenswrapper[4772]: I0127 15:56:48.663153 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:56:48 crc kubenswrapper[4772]: E0127 15:56:48.663847 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:57:00 crc kubenswrapper[4772]: I0127 15:57:00.663559 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:57:00 crc kubenswrapper[4772]: E0127 15:57:00.664799 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:57:11 crc kubenswrapper[4772]: I0127 15:57:11.663128 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:57:11 crc kubenswrapper[4772]: E0127 15:57:11.663870 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:57:25 crc kubenswrapper[4772]: I0127 15:57:25.663256 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:57:25 crc kubenswrapper[4772]: E0127 15:57:25.664105 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:57:36 crc kubenswrapper[4772]: I0127 15:57:36.663329 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:57:36 crc kubenswrapper[4772]: E0127 15:57:36.665768 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:57:49 crc kubenswrapper[4772]: I0127 15:57:49.663145 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:57:49 crc kubenswrapper[4772]: E0127 15:57:49.664488 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:58:00 crc kubenswrapper[4772]: I0127 15:58:00.664141 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:58:00 crc kubenswrapper[4772]: E0127 15:58:00.664988 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:58:12 crc kubenswrapper[4772]: I0127 15:58:12.663060 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:58:12 crc kubenswrapper[4772]: E0127 15:58:12.664061 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:58:26 crc kubenswrapper[4772]: I0127 15:58:26.666797 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:58:26 crc kubenswrapper[4772]: E0127 15:58:26.667569 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:58:40 crc kubenswrapper[4772]: I0127 15:58:40.664033 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:58:40 crc kubenswrapper[4772]: E0127 15:58:40.664887 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:58:54 crc kubenswrapper[4772]: I0127 15:58:54.667067 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:58:54 crc kubenswrapper[4772]: E0127 15:58:54.667830 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:59:06 crc kubenswrapper[4772]: I0127 15:59:06.664190 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:59:06 crc kubenswrapper[4772]: E0127 15:59:06.664772 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:59:18 crc kubenswrapper[4772]: I0127 15:59:18.663951 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:59:18 crc kubenswrapper[4772]: E0127 15:59:18.664735 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:59:29 crc kubenswrapper[4772]: I0127 15:59:29.663483 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:59:29 crc kubenswrapper[4772]: E0127 15:59:29.664345 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:59:43 crc kubenswrapper[4772]: I0127 15:59:43.663495 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:59:43 crc kubenswrapper[4772]: E0127 15:59:43.664368 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 15:59:56 crc kubenswrapper[4772]: I0127 15:59:56.663801 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 15:59:56 crc kubenswrapper[4772]: E0127 15:59:56.665020 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.160359 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr"] Jan 27 16:00:00 crc kubenswrapper[4772]: E0127 16:00:00.161199 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="extract-utilities" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.161218 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="extract-utilities" Jan 27 16:00:00 crc kubenswrapper[4772]: E0127 16:00:00.161236 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="registry-server" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.161243 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="registry-server" Jan 27 16:00:00 crc kubenswrapper[4772]: E0127 16:00:00.161267 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="extract-content" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.161276 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="extract-content" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.161436 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="226b848a-07cc-44c2-abc3-c60d88c569ca" containerName="registry-server" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.161979 4772 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.166452 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.166456 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.170533 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr"] Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.355645 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2488\" (UniqueName: \"kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.356452 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.356527 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.458155 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2488\" (UniqueName: \"kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.458252 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.458272 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.459138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.467782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.478372 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2488\" (UniqueName: \"kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488\") pod \"collect-profiles-29492160-4mmgr\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.484756 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:00 crc kubenswrapper[4772]: I0127 16:00:00.918636 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr"] Jan 27 16:00:01 crc kubenswrapper[4772]: I0127 16:00:01.102683 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" event={"ID":"9b4afc52-82aa-4768-9cc8-5e9236fc4330","Type":"ContainerStarted","Data":"8dd8add741d2c060daa432ff2f192c7a04c82b2eab3360197f22c851ca7bd6c0"} Jan 27 16:00:01 crc kubenswrapper[4772]: I0127 16:00:01.102732 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" event={"ID":"9b4afc52-82aa-4768-9cc8-5e9236fc4330","Type":"ContainerStarted","Data":"4e6de38fa85b71d772074ba8f95e475599369849f60252db8cf1c2b677f38292"} Jan 27 16:00:01 crc kubenswrapper[4772]: I0127 16:00:01.119108 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" 
podStartSLOduration=1.119093843 podStartE2EDuration="1.119093843s" podCreationTimestamp="2026-01-27 16:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:00:01.116762686 +0000 UTC m=+3187.097371784" watchObservedRunningTime="2026-01-27 16:00:01.119093843 +0000 UTC m=+3187.099702931" Jan 27 16:00:02 crc kubenswrapper[4772]: I0127 16:00:02.113993 4772 generic.go:334] "Generic (PLEG): container finished" podID="9b4afc52-82aa-4768-9cc8-5e9236fc4330" containerID="8dd8add741d2c060daa432ff2f192c7a04c82b2eab3360197f22c851ca7bd6c0" exitCode=0 Jan 27 16:00:02 crc kubenswrapper[4772]: I0127 16:00:02.114115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" event={"ID":"9b4afc52-82aa-4768-9cc8-5e9236fc4330","Type":"ContainerDied","Data":"8dd8add741d2c060daa432ff2f192c7a04c82b2eab3360197f22c851ca7bd6c0"} Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.392433 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.517792 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2488\" (UniqueName: \"kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488\") pod \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.517859 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume\") pod \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.517883 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume\") pod \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\" (UID: \"9b4afc52-82aa-4768-9cc8-5e9236fc4330\") " Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.518629 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b4afc52-82aa-4768-9cc8-5e9236fc4330" (UID: "9b4afc52-82aa-4768-9cc8-5e9236fc4330"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.522663 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488" (OuterVolumeSpecName: "kube-api-access-c2488") pod "9b4afc52-82aa-4768-9cc8-5e9236fc4330" (UID: "9b4afc52-82aa-4768-9cc8-5e9236fc4330"). 
InnerVolumeSpecName "kube-api-access-c2488". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.522908 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b4afc52-82aa-4768-9cc8-5e9236fc4330" (UID: "9b4afc52-82aa-4768-9cc8-5e9236fc4330"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.619480 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2488\" (UniqueName: \"kubernetes.io/projected/9b4afc52-82aa-4768-9cc8-5e9236fc4330-kube-api-access-c2488\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.619534 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b4afc52-82aa-4768-9cc8-5e9236fc4330-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:03 crc kubenswrapper[4772]: I0127 16:00:03.619551 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b4afc52-82aa-4768-9cc8-5e9236fc4330-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.128034 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" event={"ID":"9b4afc52-82aa-4768-9cc8-5e9236fc4330","Type":"ContainerDied","Data":"4e6de38fa85b71d772074ba8f95e475599369849f60252db8cf1c2b677f38292"} Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.128086 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e6de38fa85b71d772074ba8f95e475599369849f60252db8cf1c2b677f38292" Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.128123 4772 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr" Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.465278 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g"] Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.469320 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492115-hb89g"] Jan 27 16:00:04 crc kubenswrapper[4772]: I0127 16:00:04.676415 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1" path="/var/lib/kubelet/pods/616d65ac-8b2b-4b0a-b7a1-ca3516ad7cf1/volumes" Jan 27 16:00:05 crc kubenswrapper[4772]: I0127 16:00:05.727763 4772 scope.go:117] "RemoveContainer" containerID="8931f0cc38dd8c453e687a0b65ac6a9c2d9a0265440b30f5480c6ac0483f9860" Jan 27 16:00:08 crc kubenswrapper[4772]: I0127 16:00:08.663262 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 16:00:08 crc kubenswrapper[4772]: E0127 16:00:08.663800 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.372021 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:09 crc kubenswrapper[4772]: E0127 16:00:09.372449 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4afc52-82aa-4768-9cc8-5e9236fc4330" containerName="collect-profiles" Jan 
27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.372472 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4afc52-82aa-4768-9cc8-5e9236fc4330" containerName="collect-profiles" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.372632 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4afc52-82aa-4768-9cc8-5e9236fc4330" containerName="collect-profiles" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.373858 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.381030 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.411198 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.411563 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.411584 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfh4p\" (UniqueName: \"kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " 
pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.512713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.512844 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.512876 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfh4p\" (UniqueName: \"kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.513430 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.513563 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " 
pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.531831 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfh4p\" (UniqueName: \"kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p\") pod \"certified-operators-8lq7p\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:09 crc kubenswrapper[4772]: I0127 16:00:09.691732 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:10 crc kubenswrapper[4772]: I0127 16:00:10.164367 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.173313 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.177439 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.177782 4772 generic.go:334] "Generic (PLEG): container finished" podID="0e55c646-e887-492e-b27d-39536b38b245" containerID="5f1d8fdc7aed07858ca3363c0144e81cdc92e3922938515a33edf543ec95389e" exitCode=0 Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.177829 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerDied","Data":"5f1d8fdc7aed07858ca3363c0144e81cdc92e3922938515a33edf543ec95389e"} Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.177856 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerStarted","Data":"215607eb1ef46147418a0a5c60a763e046da6f15ce0cdc380aa944d9788c12db"} Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.182602 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.192826 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.340980 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxvx\" (UniqueName: \"kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.341063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.341104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.442778 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxvx\" (UniqueName: \"kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.442868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.442904 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.443434 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.443752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.468427 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxvx\" (UniqueName: \"kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx\") pod \"community-operators-8d9vw\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.510577 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:11 crc kubenswrapper[4772]: I0127 16:00:11.842034 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.174687 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.176819 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.183343 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.187386 4772 generic.go:334] "Generic (PLEG): container finished" podID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerID="eb2b24b6937f9d97b2ef4c64dd3a412a15206c847cf6c127667af5183957718b" exitCode=0 Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.187940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerDied","Data":"eb2b24b6937f9d97b2ef4c64dd3a412a15206c847cf6c127667af5183957718b"} Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.188010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerStarted","Data":"2fd0ace984bed3e0eb564ba4cf8034119c99f0f4a375915de39fff1909a1dfe1"} Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.192044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerStarted","Data":"9ff88942bb2ffd8b58cb3374157dc07b383481f7c4b4e26e88426fd9c218bfac"} Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.261814 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.261927 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjcj\" (UniqueName: \"kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.262050 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.362751 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.362817 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.362858 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjcj\" (UniqueName: \"kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.363606 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.363838 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.398887 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjcj\" (UniqueName: \"kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj\") pod \"redhat-marketplace-l26xs\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.496668 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:12 crc kubenswrapper[4772]: I0127 16:00:12.718944 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.199261 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerStarted","Data":"c0b6e0a68c3dc10ef4d1ff53c5444a6a40a8fa4bff91ff29c1779550fd4bf956"} Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.200464 4772 generic.go:334] "Generic (PLEG): container finished" podID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerID="e7fc82eac5cd511389005799fa7a2100c759a473c11e1b04bce834caa53ea989" exitCode=0 Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.200521 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerDied","Data":"e7fc82eac5cd511389005799fa7a2100c759a473c11e1b04bce834caa53ea989"} Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.200549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerStarted","Data":"1b80836c83d3067947b0be4ac5034b810cd9fa2107cf4663070bd74214922382"} Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.201939 4772 generic.go:334] "Generic (PLEG): container finished" podID="0e55c646-e887-492e-b27d-39536b38b245" containerID="9ff88942bb2ffd8b58cb3374157dc07b383481f7c4b4e26e88426fd9c218bfac" exitCode=0 Jan 27 16:00:13 crc kubenswrapper[4772]: I0127 16:00:13.201967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" 
event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerDied","Data":"9ff88942bb2ffd8b58cb3374157dc07b383481f7c4b4e26e88426fd9c218bfac"} Jan 27 16:00:14 crc kubenswrapper[4772]: I0127 16:00:14.209842 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerStarted","Data":"efda54e04a3b9158207e5837fd7a6528bdd8fc43bf202745462e3bbb8c32bdb2"} Jan 27 16:00:14 crc kubenswrapper[4772]: I0127 16:00:14.212180 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerStarted","Data":"0e58bb490648f3a839abadeb3266c1fb8ad0a72f82a439d1c989dbde19289df2"} Jan 27 16:00:14 crc kubenswrapper[4772]: I0127 16:00:14.214591 4772 generic.go:334] "Generic (PLEG): container finished" podID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerID="c0b6e0a68c3dc10ef4d1ff53c5444a6a40a8fa4bff91ff29c1779550fd4bf956" exitCode=0 Jan 27 16:00:14 crc kubenswrapper[4772]: I0127 16:00:14.214630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerDied","Data":"c0b6e0a68c3dc10ef4d1ff53c5444a6a40a8fa4bff91ff29c1779550fd4bf956"} Jan 27 16:00:14 crc kubenswrapper[4772]: I0127 16:00:14.252110 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8lq7p" podStartSLOduration=2.661111567 podStartE2EDuration="5.252089884s" podCreationTimestamp="2026-01-27 16:00:09 +0000 UTC" firstStartedPulling="2026-01-27 16:00:11.182282904 +0000 UTC m=+3197.162892002" lastFinishedPulling="2026-01-27 16:00:13.773261221 +0000 UTC m=+3199.753870319" observedRunningTime="2026-01-27 16:00:14.249905951 +0000 UTC m=+3200.230515049" watchObservedRunningTime="2026-01-27 16:00:14.252089884 +0000 UTC 
m=+3200.232698982" Jan 27 16:00:15 crc kubenswrapper[4772]: I0127 16:00:15.232594 4772 generic.go:334] "Generic (PLEG): container finished" podID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerID="efda54e04a3b9158207e5837fd7a6528bdd8fc43bf202745462e3bbb8c32bdb2" exitCode=0 Jan 27 16:00:15 crc kubenswrapper[4772]: I0127 16:00:15.232662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerDied","Data":"efda54e04a3b9158207e5837fd7a6528bdd8fc43bf202745462e3bbb8c32bdb2"} Jan 27 16:00:15 crc kubenswrapper[4772]: I0127 16:00:15.238156 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerStarted","Data":"70571922f32975b959e9d6c31230572ab940e3dd080afcd55f53dd0e4c7d8039"} Jan 27 16:00:15 crc kubenswrapper[4772]: I0127 16:00:15.274022 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8d9vw" podStartSLOduration=1.844111055 podStartE2EDuration="4.274000154s" podCreationTimestamp="2026-01-27 16:00:11 +0000 UTC" firstStartedPulling="2026-01-27 16:00:12.190458679 +0000 UTC m=+3198.171067777" lastFinishedPulling="2026-01-27 16:00:14.620347778 +0000 UTC m=+3200.600956876" observedRunningTime="2026-01-27 16:00:15.272259634 +0000 UTC m=+3201.252868752" watchObservedRunningTime="2026-01-27 16:00:15.274000154 +0000 UTC m=+3201.254609252" Jan 27 16:00:16 crc kubenswrapper[4772]: I0127 16:00:16.245362 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerStarted","Data":"dfe7a2bf50079c2e751ce5574fccab4a5353b71f051b83c79197ffe548a713a9"} Jan 27 16:00:16 crc kubenswrapper[4772]: I0127 16:00:16.264406 4772 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-marketplace-l26xs" podStartSLOduration=1.725218721 podStartE2EDuration="4.264387719s" podCreationTimestamp="2026-01-27 16:00:12 +0000 UTC" firstStartedPulling="2026-01-27 16:00:13.201487228 +0000 UTC m=+3199.182096326" lastFinishedPulling="2026-01-27 16:00:15.740656226 +0000 UTC m=+3201.721265324" observedRunningTime="2026-01-27 16:00:16.262750542 +0000 UTC m=+3202.243359640" watchObservedRunningTime="2026-01-27 16:00:16.264387719 +0000 UTC m=+3202.244996817" Jan 27 16:00:19 crc kubenswrapper[4772]: I0127 16:00:19.692520 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:19 crc kubenswrapper[4772]: I0127 16:00:19.692908 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:19 crc kubenswrapper[4772]: I0127 16:00:19.744948 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:20 crc kubenswrapper[4772]: I0127 16:00:20.314905 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:20 crc kubenswrapper[4772]: I0127 16:00:20.662583 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 16:00:21 crc kubenswrapper[4772]: I0127 16:00:21.511366 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:21 crc kubenswrapper[4772]: I0127 16:00:21.511413 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:21 crc kubenswrapper[4772]: I0127 16:00:21.556934 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.173384 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.367056 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8lq7p" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="registry-server" containerID="cri-o://0e58bb490648f3a839abadeb3266c1fb8ad0a72f82a439d1c989dbde19289df2" gracePeriod=2 Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.367435 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592"} Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.481515 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.498455 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.499241 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:22 crc kubenswrapper[4772]: I0127 16:00:22.552876 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:23 crc kubenswrapper[4772]: I0127 16:00:23.425466 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:23 crc kubenswrapper[4772]: I0127 16:00:23.957078 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:24 crc kubenswrapper[4772]: I0127 16:00:24.386188 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8d9vw" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="registry-server" containerID="cri-o://70571922f32975b959e9d6c31230572ab940e3dd080afcd55f53dd0e4c7d8039" gracePeriod=2 Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.395506 4772 generic.go:334] "Generic (PLEG): container finished" podID="0e55c646-e887-492e-b27d-39536b38b245" containerID="0e58bb490648f3a839abadeb3266c1fb8ad0a72f82a439d1c989dbde19289df2" exitCode=0 Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.396514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerDied","Data":"0e58bb490648f3a839abadeb3266c1fb8ad0a72f82a439d1c989dbde19289df2"} Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.860507 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.987279 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfh4p\" (UniqueName: \"kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p\") pod \"0e55c646-e887-492e-b27d-39536b38b245\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.987344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities\") pod \"0e55c646-e887-492e-b27d-39536b38b245\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.987415 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content\") pod \"0e55c646-e887-492e-b27d-39536b38b245\" (UID: \"0e55c646-e887-492e-b27d-39536b38b245\") " Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.988382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities" (OuterVolumeSpecName: "utilities") pod "0e55c646-e887-492e-b27d-39536b38b245" (UID: "0e55c646-e887-492e-b27d-39536b38b245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:25 crc kubenswrapper[4772]: I0127 16:00:25.996693 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p" (OuterVolumeSpecName: "kube-api-access-gfh4p") pod "0e55c646-e887-492e-b27d-39536b38b245" (UID: "0e55c646-e887-492e-b27d-39536b38b245"). InnerVolumeSpecName "kube-api-access-gfh4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.036278 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e55c646-e887-492e-b27d-39536b38b245" (UID: "0e55c646-e887-492e-b27d-39536b38b245"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.088807 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfh4p\" (UniqueName: \"kubernetes.io/projected/0e55c646-e887-492e-b27d-39536b38b245-kube-api-access-gfh4p\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.088871 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.088891 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e55c646-e887-492e-b27d-39536b38b245-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.356555 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.405735 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8lq7p" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.405729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8lq7p" event={"ID":"0e55c646-e887-492e-b27d-39536b38b245","Type":"ContainerDied","Data":"215607eb1ef46147418a0a5c60a763e046da6f15ce0cdc380aa944d9788c12db"} Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.405895 4772 scope.go:117] "RemoveContainer" containerID="0e58bb490648f3a839abadeb3266c1fb8ad0a72f82a439d1c989dbde19289df2" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.414316 4772 generic.go:334] "Generic (PLEG): container finished" podID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerID="70571922f32975b959e9d6c31230572ab940e3dd080afcd55f53dd0e4c7d8039" exitCode=0 Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.414437 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerDied","Data":"70571922f32975b959e9d6c31230572ab940e3dd080afcd55f53dd0e4c7d8039"} Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.414549 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l26xs" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="registry-server" containerID="cri-o://dfe7a2bf50079c2e751ce5574fccab4a5353b71f051b83c79197ffe548a713a9" gracePeriod=2 Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.423922 4772 scope.go:117] "RemoveContainer" containerID="9ff88942bb2ffd8b58cb3374157dc07b383481f7c4b4e26e88426fd9c218bfac" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.444345 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.453080 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-8lq7p"] Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.467687 4772 scope.go:117] "RemoveContainer" containerID="5f1d8fdc7aed07858ca3363c0144e81cdc92e3922938515a33edf543ec95389e" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.651459 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.678631 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e55c646-e887-492e-b27d-39536b38b245" path="/var/lib/kubelet/pods/0e55c646-e887-492e-b27d-39536b38b245/volumes" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.805955 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities\") pod \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.805996 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content\") pod \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.806096 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpxvx\" (UniqueName: \"kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx\") pod \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\" (UID: \"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d\") " Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.806886 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities" (OuterVolumeSpecName: 
"utilities") pod "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" (UID: "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.807557 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.809413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx" (OuterVolumeSpecName: "kube-api-access-cpxvx") pod "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" (UID: "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d"). InnerVolumeSpecName "kube-api-access-cpxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.857775 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" (UID: "a4dcc32f-d75b-420b-8dc1-c1d45ce1390d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.909065 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpxvx\" (UniqueName: \"kubernetes.io/projected/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-kube-api-access-cpxvx\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:26 crc kubenswrapper[4772]: I0127 16:00:26.909100 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.447414 4772 generic.go:334] "Generic (PLEG): container finished" podID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerID="dfe7a2bf50079c2e751ce5574fccab4a5353b71f051b83c79197ffe548a713a9" exitCode=0 Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.447484 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerDied","Data":"dfe7a2bf50079c2e751ce5574fccab4a5353b71f051b83c79197ffe548a713a9"} Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.484876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8d9vw" event={"ID":"a4dcc32f-d75b-420b-8dc1-c1d45ce1390d","Type":"ContainerDied","Data":"2fd0ace984bed3e0eb564ba4cf8034119c99f0f4a375915de39fff1909a1dfe1"} Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.484927 4772 scope.go:117] "RemoveContainer" containerID="70571922f32975b959e9d6c31230572ab940e3dd080afcd55f53dd0e4c7d8039" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.485057 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8d9vw" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.535454 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.543881 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8d9vw"] Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.544327 4772 scope.go:117] "RemoveContainer" containerID="c0b6e0a68c3dc10ef4d1ff53c5444a6a40a8fa4bff91ff29c1779550fd4bf956" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.583061 4772 scope.go:117] "RemoveContainer" containerID="eb2b24b6937f9d97b2ef4c64dd3a412a15206c847cf6c127667af5183957718b" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.648884 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.750970 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities\") pod \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.751049 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content\") pod \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\" (UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.751085 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvjcj\" (UniqueName: \"kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj\") pod \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\" 
(UID: \"52988c66-42d0-4aa5-a46d-d966ac84eb6c\") " Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.752378 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities" (OuterVolumeSpecName: "utilities") pod "52988c66-42d0-4aa5-a46d-d966ac84eb6c" (UID: "52988c66-42d0-4aa5-a46d-d966ac84eb6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.755034 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj" (OuterVolumeSpecName: "kube-api-access-nvjcj") pod "52988c66-42d0-4aa5-a46d-d966ac84eb6c" (UID: "52988c66-42d0-4aa5-a46d-d966ac84eb6c"). InnerVolumeSpecName "kube-api-access-nvjcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.774300 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52988c66-42d0-4aa5-a46d-d966ac84eb6c" (UID: "52988c66-42d0-4aa5-a46d-d966ac84eb6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.852942 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.852981 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvjcj\" (UniqueName: \"kubernetes.io/projected/52988c66-42d0-4aa5-a46d-d966ac84eb6c-kube-api-access-nvjcj\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:27 crc kubenswrapper[4772]: I0127 16:00:27.852992 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52988c66-42d0-4aa5-a46d-d966ac84eb6c-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.494953 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l26xs" event={"ID":"52988c66-42d0-4aa5-a46d-d966ac84eb6c","Type":"ContainerDied","Data":"1b80836c83d3067947b0be4ac5034b810cd9fa2107cf4663070bd74214922382"} Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.494965 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l26xs" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.495438 4772 scope.go:117] "RemoveContainer" containerID="dfe7a2bf50079c2e751ce5574fccab4a5353b71f051b83c79197ffe548a713a9" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.511681 4772 scope.go:117] "RemoveContainer" containerID="efda54e04a3b9158207e5837fd7a6528bdd8fc43bf202745462e3bbb8c32bdb2" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.527237 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.533519 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l26xs"] Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.550493 4772 scope.go:117] "RemoveContainer" containerID="e7fc82eac5cd511389005799fa7a2100c759a473c11e1b04bce834caa53ea989" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.673672 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" path="/var/lib/kubelet/pods/52988c66-42d0-4aa5-a46d-d966ac84eb6c/volumes" Jan 27 16:00:28 crc kubenswrapper[4772]: I0127 16:00:28.675668 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" path="/var/lib/kubelet/pods/a4dcc32f-d75b-420b-8dc1-c1d45ce1390d/volumes" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.551050 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555688 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555723 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e55c646-e887-492e-b27d-39536b38b245" 
containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555748 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555757 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555775 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555783 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555802 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555810 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555855 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555863 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555878 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555886 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e55c646-e887-492e-b27d-39536b38b245" 
containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555907 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555915 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="extract-content" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555928 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555941 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: E0127 16:01:03.555953 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.555962 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="extract-utilities" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.556525 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4dcc32f-d75b-420b-8dc1-c1d45ce1390d" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.556563 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e55c646-e887-492e-b27d-39536b38b245" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.556590 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="52988c66-42d0-4aa5-a46d-d966ac84eb6c" containerName="registry-server" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.559209 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.573552 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.674079 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.674436 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.674546 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8nn\" (UniqueName: \"kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.775865 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.776212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xg8nn\" (UniqueName: \"kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.776426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.776553 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.776821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.797014 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8nn\" (UniqueName: \"kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn\") pod \"redhat-operators-tz78w\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:03 crc kubenswrapper[4772]: I0127 16:01:03.893136 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:04 crc kubenswrapper[4772]: I0127 16:01:04.374470 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:04 crc kubenswrapper[4772]: I0127 16:01:04.755364 4772 generic.go:334] "Generic (PLEG): container finished" podID="3cd0f6c5-43f1-4120-841d-76e540249886" containerID="5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02" exitCode=0 Jan 27 16:01:04 crc kubenswrapper[4772]: I0127 16:01:04.755633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerDied","Data":"5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02"} Jan 27 16:01:04 crc kubenswrapper[4772]: I0127 16:01:04.755660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerStarted","Data":"61dc4f44513fd78fc78365968bdd4024b29d704ed02e0432d82b9f4fbc0894cd"} Jan 27 16:01:06 crc kubenswrapper[4772]: I0127 16:01:06.770798 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerStarted","Data":"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f"} Jan 27 16:01:07 crc kubenswrapper[4772]: I0127 16:01:07.794314 4772 generic.go:334] "Generic (PLEG): container finished" podID="3cd0f6c5-43f1-4120-841d-76e540249886" containerID="d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f" exitCode=0 Jan 27 16:01:07 crc kubenswrapper[4772]: I0127 16:01:07.794406 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" 
event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerDied","Data":"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f"} Jan 27 16:01:08 crc kubenswrapper[4772]: I0127 16:01:08.803414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerStarted","Data":"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc"} Jan 27 16:01:13 crc kubenswrapper[4772]: I0127 16:01:13.893941 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:13 crc kubenswrapper[4772]: I0127 16:01:13.894479 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:13 crc kubenswrapper[4772]: I0127 16:01:13.936273 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:13 crc kubenswrapper[4772]: I0127 16:01:13.962428 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tz78w" podStartSLOduration=7.533237794 podStartE2EDuration="10.962396342s" podCreationTimestamp="2026-01-27 16:01:03 +0000 UTC" firstStartedPulling="2026-01-27 16:01:04.757229925 +0000 UTC m=+3250.737839023" lastFinishedPulling="2026-01-27 16:01:08.186388453 +0000 UTC m=+3254.166997571" observedRunningTime="2026-01-27 16:01:08.83103225 +0000 UTC m=+3254.811641358" watchObservedRunningTime="2026-01-27 16:01:13.962396342 +0000 UTC m=+3259.943005440" Jan 27 16:01:14 crc kubenswrapper[4772]: I0127 16:01:14.891015 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:14 crc kubenswrapper[4772]: I0127 16:01:14.932318 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:16 crc kubenswrapper[4772]: I0127 16:01:16.850589 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tz78w" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="registry-server" containerID="cri-o://f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc" gracePeriod=2 Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.578232 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.692189 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities\") pod \"3cd0f6c5-43f1-4120-841d-76e540249886\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.692565 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content\") pod \"3cd0f6c5-43f1-4120-841d-76e540249886\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.692598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg8nn\" (UniqueName: \"kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn\") pod \"3cd0f6c5-43f1-4120-841d-76e540249886\" (UID: \"3cd0f6c5-43f1-4120-841d-76e540249886\") " Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.693294 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities" (OuterVolumeSpecName: "utilities") pod "3cd0f6c5-43f1-4120-841d-76e540249886" (UID: 
"3cd0f6c5-43f1-4120-841d-76e540249886"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.698480 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn" (OuterVolumeSpecName: "kube-api-access-xg8nn") pod "3cd0f6c5-43f1-4120-841d-76e540249886" (UID: "3cd0f6c5-43f1-4120-841d-76e540249886"). InnerVolumeSpecName "kube-api-access-xg8nn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.794402 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg8nn\" (UniqueName: \"kubernetes.io/projected/3cd0f6c5-43f1-4120-841d-76e540249886-kube-api-access-xg8nn\") on node \"crc\" DevicePath \"\"" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.794438 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.823016 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cd0f6c5-43f1-4120-841d-76e540249886" (UID: "3cd0f6c5-43f1-4120-841d-76e540249886"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.865961 4772 generic.go:334] "Generic (PLEG): container finished" podID="3cd0f6c5-43f1-4120-841d-76e540249886" containerID="f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc" exitCode=0 Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.866011 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerDied","Data":"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc"} Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.866042 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tz78w" event={"ID":"3cd0f6c5-43f1-4120-841d-76e540249886","Type":"ContainerDied","Data":"61dc4f44513fd78fc78365968bdd4024b29d704ed02e0432d82b9f4fbc0894cd"} Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.866057 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tz78w" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.866066 4772 scope.go:117] "RemoveContainer" containerID="f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.895836 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cd0f6c5-43f1-4120-841d-76e540249886-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.900309 4772 scope.go:117] "RemoveContainer" containerID="d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.929544 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.932581 4772 scope.go:117] "RemoveContainer" containerID="5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.946907 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tz78w"] Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.986364 4772 scope.go:117] "RemoveContainer" containerID="f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc" Jan 27 16:01:18 crc kubenswrapper[4772]: E0127 16:01:18.990098 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc\": container with ID starting with f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc not found: ID does not exist" containerID="f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.990152 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc"} err="failed to get container status \"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc\": rpc error: code = NotFound desc = could not find container \"f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc\": container with ID starting with f6f9a4e35dfde3b6f98ddcc01331fdfec4af28b5b9e64b6ca58bee7f89fbaddc not found: ID does not exist" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.990200 4772 scope.go:117] "RemoveContainer" containerID="d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f" Jan 27 16:01:18 crc kubenswrapper[4772]: E0127 16:01:18.997214 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f\": container with ID starting with d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f not found: ID does not exist" containerID="d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.997266 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f"} err="failed to get container status \"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f\": rpc error: code = NotFound desc = could not find container \"d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f\": container with ID starting with d626ad0bd0e93bf253da87fff3d7e5845272782b38c1fe8f821ce4c09122879f not found: ID does not exist" Jan 27 16:01:18 crc kubenswrapper[4772]: I0127 16:01:18.997298 4772 scope.go:117] "RemoveContainer" containerID="5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02" Jan 27 16:01:19 crc kubenswrapper[4772]: E0127 16:01:19.002447 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02\": container with ID starting with 5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02 not found: ID does not exist" containerID="5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02" Jan 27 16:01:19 crc kubenswrapper[4772]: I0127 16:01:19.002501 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02"} err="failed to get container status \"5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02\": rpc error: code = NotFound desc = could not find container \"5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02\": container with ID starting with 5c076ec7e9f7ce861de2aeb2d2ce1407d2581066fe8432d268d48aa28c136d02 not found: ID does not exist" Jan 27 16:01:20 crc kubenswrapper[4772]: I0127 16:01:20.672507 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" path="/var/lib/kubelet/pods/3cd0f6c5-43f1-4120-841d-76e540249886/volumes" Jan 27 16:02:42 crc kubenswrapper[4772]: I0127 16:02:42.058695 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:02:42 crc kubenswrapper[4772]: I0127 16:02:42.059286 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:03:12 crc kubenswrapper[4772]: I0127 
16:03:12.058256 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:03:12 crc kubenswrapper[4772]: I0127 16:03:12.058874 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.058556 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.059312 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.059397 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.060126 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.060240 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592" gracePeriod=600 Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.845843 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592" exitCode=0 Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.845954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592"} Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.846321 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28"} Jan 27 16:03:42 crc kubenswrapper[4772]: I0127 16:03:42.846363 4772 scope.go:117] "RemoveContainer" containerID="bf80c85ff055e5b66481b1fb0c03a4a19bc2dadb96e8c295e0086beb0bb97a51" Jan 27 16:05:42 crc kubenswrapper[4772]: I0127 16:05:42.059162 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:05:42 crc kubenswrapper[4772]: I0127 
16:05:42.060059 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:06:12 crc kubenswrapper[4772]: I0127 16:06:12.058524 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:06:12 crc kubenswrapper[4772]: I0127 16:06:12.059398 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.059401 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.060378 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.060432 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.061244 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.061333 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" gracePeriod=600 Jan 27 16:06:42 crc kubenswrapper[4772]: E0127 16:06:42.194895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.513959 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" exitCode=0 Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.514352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28"} Jan 27 16:06:42 crc 
kubenswrapper[4772]: I0127 16:06:42.514522 4772 scope.go:117] "RemoveContainer" containerID="21ef3f853d795985962b240174bbec1611d9f0f58af15b07556fa41617b20592" Jan 27 16:06:42 crc kubenswrapper[4772]: I0127 16:06:42.515389 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:06:42 crc kubenswrapper[4772]: E0127 16:06:42.515936 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:06:55 crc kubenswrapper[4772]: I0127 16:06:55.662858 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:06:55 crc kubenswrapper[4772]: E0127 16:06:55.664669 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:07:10 crc kubenswrapper[4772]: I0127 16:07:10.663907 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:07:10 crc kubenswrapper[4772]: E0127 16:07:10.666051 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:07:21 crc kubenswrapper[4772]: I0127 16:07:21.663378 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:07:21 crc kubenswrapper[4772]: E0127 16:07:21.664153 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:07:33 crc kubenswrapper[4772]: I0127 16:07:33.663237 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:07:33 crc kubenswrapper[4772]: E0127 16:07:33.664123 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:07:45 crc kubenswrapper[4772]: I0127 16:07:45.662683 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:07:45 crc kubenswrapper[4772]: E0127 16:07:45.663503 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:00 crc kubenswrapper[4772]: I0127 16:08:00.664179 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:00 crc kubenswrapper[4772]: E0127 16:08:00.664892 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:11 crc kubenswrapper[4772]: I0127 16:08:11.663489 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:11 crc kubenswrapper[4772]: E0127 16:08:11.664469 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:22 crc kubenswrapper[4772]: I0127 16:08:22.662708 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:22 crc kubenswrapper[4772]: E0127 16:08:22.663400 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:33 crc kubenswrapper[4772]: I0127 16:08:33.663360 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:33 crc kubenswrapper[4772]: E0127 16:08:33.664434 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:45 crc kubenswrapper[4772]: I0127 16:08:45.663725 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:45 crc kubenswrapper[4772]: E0127 16:08:45.664663 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:08:56 crc kubenswrapper[4772]: I0127 16:08:56.665589 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:08:56 crc kubenswrapper[4772]: E0127 16:08:56.666917 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:09:10 crc kubenswrapper[4772]: I0127 16:09:10.663920 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:09:10 crc kubenswrapper[4772]: E0127 16:09:10.684547 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:09:25 crc kubenswrapper[4772]: I0127 16:09:25.663143 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:09:25 crc kubenswrapper[4772]: E0127 16:09:25.663938 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:09:37 crc kubenswrapper[4772]: I0127 16:09:37.662876 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:09:37 crc kubenswrapper[4772]: E0127 16:09:37.663694 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:09:51 crc kubenswrapper[4772]: I0127 16:09:51.663403 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:09:51 crc kubenswrapper[4772]: E0127 16:09:51.664290 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:02 crc kubenswrapper[4772]: I0127 16:10:02.663717 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:10:02 crc kubenswrapper[4772]: E0127 16:10:02.664594 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.096932 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:14 crc kubenswrapper[4772]: E0127 16:10:14.102143 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" 
containerName="registry-server" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.102210 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="registry-server" Jan 27 16:10:14 crc kubenswrapper[4772]: E0127 16:10:14.102244 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="extract-utilities" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.102256 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="extract-utilities" Jan 27 16:10:14 crc kubenswrapper[4772]: E0127 16:10:14.102274 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="extract-content" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.102287 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="extract-content" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.102540 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd0f6c5-43f1-4120-841d-76e540249886" containerName="registry-server" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.104222 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.111779 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.164909 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.165025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw26c\" (UniqueName: \"kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.165052 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.266157 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.266287 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jw26c\" (UniqueName: \"kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.266309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.266774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.267214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.290469 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw26c\" (UniqueName: \"kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c\") pod \"certified-operators-z6dng\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.435454 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:14 crc kubenswrapper[4772]: I0127 16:10:14.910775 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:15 crc kubenswrapper[4772]: I0127 16:10:15.175196 4772 generic.go:334] "Generic (PLEG): container finished" podID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerID="35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f" exitCode=0 Jan 27 16:10:15 crc kubenswrapper[4772]: I0127 16:10:15.175266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerDied","Data":"35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f"} Jan 27 16:10:15 crc kubenswrapper[4772]: I0127 16:10:15.175532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerStarted","Data":"558e1f6a78cf7d4a5320f1cf0bc3c39c426c541b60e924bae60ed80904aad6b9"} Jan 27 16:10:15 crc kubenswrapper[4772]: I0127 16:10:15.177859 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:10:17 crc kubenswrapper[4772]: I0127 16:10:17.200080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerStarted","Data":"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62"} Jan 27 16:10:17 crc kubenswrapper[4772]: I0127 16:10:17.662524 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:10:17 crc kubenswrapper[4772]: E0127 16:10:17.662783 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:18 crc kubenswrapper[4772]: I0127 16:10:18.211946 4772 generic.go:334] "Generic (PLEG): container finished" podID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerID="6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62" exitCode=0 Jan 27 16:10:18 crc kubenswrapper[4772]: I0127 16:10:18.212026 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerDied","Data":"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62"} Jan 27 16:10:19 crc kubenswrapper[4772]: I0127 16:10:19.219780 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerStarted","Data":"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece"} Jan 27 16:10:19 crc kubenswrapper[4772]: I0127 16:10:19.243403 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z6dng" podStartSLOduration=1.810063081 podStartE2EDuration="5.243382351s" podCreationTimestamp="2026-01-27 16:10:14 +0000 UTC" firstStartedPulling="2026-01-27 16:10:15.177568788 +0000 UTC m=+3801.158177886" lastFinishedPulling="2026-01-27 16:10:18.610888058 +0000 UTC m=+3804.591497156" observedRunningTime="2026-01-27 16:10:19.235441942 +0000 UTC m=+3805.216051060" watchObservedRunningTime="2026-01-27 16:10:19.243382351 +0000 UTC m=+3805.223991459" Jan 27 16:10:24 crc kubenswrapper[4772]: I0127 16:10:24.436427 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:24 crc kubenswrapper[4772]: I0127 16:10:24.437326 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:24 crc kubenswrapper[4772]: I0127 16:10:24.517223 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:25 crc kubenswrapper[4772]: I0127 16:10:25.311218 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:25 crc kubenswrapper[4772]: I0127 16:10:25.358624 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:27 crc kubenswrapper[4772]: I0127 16:10:27.275386 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z6dng" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="registry-server" containerID="cri-o://e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece" gracePeriod=2 Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.244432 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.285033 4772 generic.go:334] "Generic (PLEG): container finished" podID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerID="e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece" exitCode=0 Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.285072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerDied","Data":"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece"} Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.285098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z6dng" event={"ID":"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e","Type":"ContainerDied","Data":"558e1f6a78cf7d4a5320f1cf0bc3c39c426c541b60e924bae60ed80904aad6b9"} Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.285114 4772 scope.go:117] "RemoveContainer" containerID="e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.285124 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z6dng" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.309700 4772 scope.go:117] "RemoveContainer" containerID="6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.324909 4772 scope.go:117] "RemoveContainer" containerID="35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.354644 4772 scope.go:117] "RemoveContainer" containerID="e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece" Jan 27 16:10:28 crc kubenswrapper[4772]: E0127 16:10:28.354976 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece\": container with ID starting with e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece not found: ID does not exist" containerID="e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.355016 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece"} err="failed to get container status \"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece\": rpc error: code = NotFound desc = could not find container \"e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece\": container with ID starting with e0d8b213023473fef3cf7c744bfb81b72647509211c96ef3e3c4b874720f6ece not found: ID does not exist" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.355047 4772 scope.go:117] "RemoveContainer" containerID="6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62" Jan 27 16:10:28 crc kubenswrapper[4772]: E0127 16:10:28.355347 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62\": container with ID starting with 6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62 not found: ID does not exist" containerID="6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.355373 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62"} err="failed to get container status \"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62\": rpc error: code = NotFound desc = could not find container \"6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62\": container with ID starting with 6aa400c6da0aa1e2baf4abe44b967b9a0669127281173590e101b0edb111df62 not found: ID does not exist" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.355392 4772 scope.go:117] "RemoveContainer" containerID="35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f" Jan 27 16:10:28 crc kubenswrapper[4772]: E0127 16:10:28.355657 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f\": container with ID starting with 35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f not found: ID does not exist" containerID="35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.355688 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f"} err="failed to get container status \"35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f\": rpc error: code = NotFound desc = could not find container 
\"35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f\": container with ID starting with 35168c6609a7407b4fc3f7bf925b654f6a0d8a97426e7a546461834bf8cb717f not found: ID does not exist" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.400230 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content\") pod \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.400306 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities\") pod \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.400342 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw26c\" (UniqueName: \"kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c\") pod \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\" (UID: \"c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e\") " Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.401211 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities" (OuterVolumeSpecName: "utilities") pod "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" (UID: "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.406227 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c" (OuterVolumeSpecName: "kube-api-access-jw26c") pod "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" (UID: "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e"). InnerVolumeSpecName "kube-api-access-jw26c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.456633 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" (UID: "c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.502525 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.502583 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.502596 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw26c\" (UniqueName: \"kubernetes.io/projected/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e-kube-api-access-jw26c\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.627078 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 
16:10:28.639549 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z6dng"] Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.663409 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:10:28 crc kubenswrapper[4772]: E0127 16:10:28.663602 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:28 crc kubenswrapper[4772]: I0127 16:10:28.676709 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" path="/var/lib/kubelet/pods/c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e/volumes" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.812964 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:39 crc kubenswrapper[4772]: E0127 16:10:39.815239 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="extract-content" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.815375 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="extract-content" Jan 27 16:10:39 crc kubenswrapper[4772]: E0127 16:10:39.815454 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="extract-utilities" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.815536 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" 
containerName="extract-utilities" Jan 27 16:10:39 crc kubenswrapper[4772]: E0127 16:10:39.815740 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="registry-server" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.815823 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="registry-server" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.816074 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cc87d0-42e9-4591-8bd0-9d4614cfbb0e" containerName="registry-server" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.823354 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.830305 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.959736 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.959807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jckq\" (UniqueName: \"kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:39 crc kubenswrapper[4772]: I0127 16:10:39.959853 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.060682 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.060737 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jckq\" (UniqueName: \"kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.060761 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.061243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.061327 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.079000 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jckq\" (UniqueName: \"kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq\") pod \"redhat-marketplace-c4gj5\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.142153 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.604203 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:40 crc kubenswrapper[4772]: I0127 16:10:40.664056 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:10:40 crc kubenswrapper[4772]: E0127 16:10:40.664325 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:41 crc kubenswrapper[4772]: I0127 16:10:41.379947 4772 generic.go:334] "Generic (PLEG): container finished" podID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerID="996c72dfdfa84fafb04e7cd89a50cece9a76678da4256459e90d7170571e1056" exitCode=0 Jan 27 16:10:41 crc kubenswrapper[4772]: I0127 16:10:41.380300 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerDied","Data":"996c72dfdfa84fafb04e7cd89a50cece9a76678da4256459e90d7170571e1056"} Jan 27 16:10:41 crc kubenswrapper[4772]: I0127 16:10:41.380334 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerStarted","Data":"492592aa862e27c3543cc1b061d716f233b3452f56f510b8b81b6ba4072a35ac"} Jan 27 16:10:42 crc kubenswrapper[4772]: I0127 16:10:42.388518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerStarted","Data":"2d5124e3976d77405e8d798c2f159be25a70440ff4f07c0ba4f655e8adb5a3fe"} Jan 27 16:10:43 crc kubenswrapper[4772]: I0127 16:10:43.397462 4772 generic.go:334] "Generic (PLEG): container finished" podID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerID="2d5124e3976d77405e8d798c2f159be25a70440ff4f07c0ba4f655e8adb5a3fe" exitCode=0 Jan 27 16:10:43 crc kubenswrapper[4772]: I0127 16:10:43.397522 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerDied","Data":"2d5124e3976d77405e8d798c2f159be25a70440ff4f07c0ba4f655e8adb5a3fe"} Jan 27 16:10:44 crc kubenswrapper[4772]: I0127 16:10:44.407055 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerStarted","Data":"1af866d659fa58e50a0610b6b8c8e12def86017e3a2f8f96a984121bf650c258"} Jan 27 16:10:44 crc kubenswrapper[4772]: I0127 16:10:44.441789 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c4gj5" podStartSLOduration=2.816111893 
podStartE2EDuration="5.441754435s" podCreationTimestamp="2026-01-27 16:10:39 +0000 UTC" firstStartedPulling="2026-01-27 16:10:41.383180199 +0000 UTC m=+3827.363789297" lastFinishedPulling="2026-01-27 16:10:44.008822751 +0000 UTC m=+3829.989431839" observedRunningTime="2026-01-27 16:10:44.432018074 +0000 UTC m=+3830.412627192" watchObservedRunningTime="2026-01-27 16:10:44.441754435 +0000 UTC m=+3830.422363583" Jan 27 16:10:50 crc kubenswrapper[4772]: I0127 16:10:50.142375 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:50 crc kubenswrapper[4772]: I0127 16:10:50.142890 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:50 crc kubenswrapper[4772]: I0127 16:10:50.179965 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:50 crc kubenswrapper[4772]: I0127 16:10:50.506490 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:50 crc kubenswrapper[4772]: I0127 16:10:50.563064 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:52 crc kubenswrapper[4772]: I0127 16:10:52.459435 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c4gj5" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="registry-server" containerID="cri-o://1af866d659fa58e50a0610b6b8c8e12def86017e3a2f8f96a984121bf650c258" gracePeriod=2 Jan 27 16:10:52 crc kubenswrapper[4772]: I0127 16:10:52.663303 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:10:52 crc kubenswrapper[4772]: E0127 16:10:52.663531 4772 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:10:53 crc kubenswrapper[4772]: I0127 16:10:53.471836 4772 generic.go:334] "Generic (PLEG): container finished" podID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerID="1af866d659fa58e50a0610b6b8c8e12def86017e3a2f8f96a984121bf650c258" exitCode=0 Jan 27 16:10:53 crc kubenswrapper[4772]: I0127 16:10:53.471879 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerDied","Data":"1af866d659fa58e50a0610b6b8c8e12def86017e3a2f8f96a984121bf650c258"} Jan 27 16:10:53 crc kubenswrapper[4772]: I0127 16:10:53.986808 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.161149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content\") pod \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.161278 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jckq\" (UniqueName: \"kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq\") pod \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.161372 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities\") pod \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\" (UID: \"2768246f-c1ba-4a6b-a591-3f2307bbb1ab\") " Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.162245 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities" (OuterVolumeSpecName: "utilities") pod "2768246f-c1ba-4a6b-a591-3f2307bbb1ab" (UID: "2768246f-c1ba-4a6b-a591-3f2307bbb1ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.182790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2768246f-c1ba-4a6b-a591-3f2307bbb1ab" (UID: "2768246f-c1ba-4a6b-a591-3f2307bbb1ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.263549 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.263587 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.481206 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4gj5" event={"ID":"2768246f-c1ba-4a6b-a591-3f2307bbb1ab","Type":"ContainerDied","Data":"492592aa862e27c3543cc1b061d716f233b3452f56f510b8b81b6ba4072a35ac"} Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.481279 4772 scope.go:117] "RemoveContainer" containerID="1af866d659fa58e50a0610b6b8c8e12def86017e3a2f8f96a984121bf650c258" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.481314 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4gj5" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.485599 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq" (OuterVolumeSpecName: "kube-api-access-2jckq") pod "2768246f-c1ba-4a6b-a591-3f2307bbb1ab" (UID: "2768246f-c1ba-4a6b-a591-3f2307bbb1ab"). InnerVolumeSpecName "kube-api-access-2jckq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.531737 4772 scope.go:117] "RemoveContainer" containerID="2d5124e3976d77405e8d798c2f159be25a70440ff4f07c0ba4f655e8adb5a3fe" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.547754 4772 scope.go:117] "RemoveContainer" containerID="996c72dfdfa84fafb04e7cd89a50cece9a76678da4256459e90d7170571e1056" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.567244 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jckq\" (UniqueName: \"kubernetes.io/projected/2768246f-c1ba-4a6b-a591-3f2307bbb1ab-kube-api-access-2jckq\") on node \"crc\" DevicePath \"\"" Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.824423 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:54 crc kubenswrapper[4772]: I0127 16:10:54.825392 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4gj5"] Jan 27 16:10:56 crc kubenswrapper[4772]: I0127 16:10:56.675070 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" path="/var/lib/kubelet/pods/2768246f-c1ba-4a6b-a591-3f2307bbb1ab/volumes" Jan 27 16:11:05 crc kubenswrapper[4772]: I0127 16:11:05.663131 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:11:05 crc kubenswrapper[4772]: E0127 16:11:05.664038 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:11:18 crc 
kubenswrapper[4772]: I0127 16:11:18.663062 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:11:18 crc kubenswrapper[4772]: E0127 16:11:18.664053 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:11:29 crc kubenswrapper[4772]: I0127 16:11:29.663511 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:11:29 crc kubenswrapper[4772]: E0127 16:11:29.664591 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:11:44 crc kubenswrapper[4772]: I0127 16:11:44.666924 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:11:45 crc kubenswrapper[4772]: I0127 16:11:45.855464 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905"} Jan 27 16:14:12 crc kubenswrapper[4772]: I0127 16:14:12.058331 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:14:12 crc kubenswrapper[4772]: I0127 16:14:12.058898 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:14:42 crc kubenswrapper[4772]: I0127 16:14:42.058585 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:14:42 crc kubenswrapper[4772]: I0127 16:14:42.059109 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.188437 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4"] Jan 27 16:15:00 crc kubenswrapper[4772]: E0127 16:15:00.189373 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="extract-content" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.189392 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="extract-content" Jan 27 16:15:00 crc kubenswrapper[4772]: E0127 
16:15:00.189420 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="extract-utilities" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.189430 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="extract-utilities" Jan 27 16:15:00 crc kubenswrapper[4772]: E0127 16:15:00.189444 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="registry-server" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.189453 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="registry-server" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.189639 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2768246f-c1ba-4a6b-a591-3f2307bbb1ab" containerName="registry-server" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.190250 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.192308 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.193345 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.206771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4"] Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.266691 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.266753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flppb\" (UniqueName: \"kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.266796 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.368070 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.368184 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flppb\" (UniqueName: \"kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.368270 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.369085 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.380028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.384416 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flppb\" (UniqueName: \"kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb\") pod \"collect-profiles-29492175-fg6x4\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.507597 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:00 crc kubenswrapper[4772]: I0127 16:15:00.748796 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4"] Jan 27 16:15:00 crc kubenswrapper[4772]: W0127 16:15:00.751009 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be908ee_6173_4ee8_80c4_0738697898d2.slice/crio-9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398 WatchSource:0}: Error finding container 9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398: Status 404 returned error can't find the container with id 9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398 Jan 27 16:15:01 crc kubenswrapper[4772]: I0127 16:15:01.297823 4772 generic.go:334] "Generic (PLEG): container finished" podID="0be908ee-6173-4ee8-80c4-0738697898d2" containerID="d013dea461e279e8b861558e82f04a509da66ccae91eabf32103d04803eb33bd" exitCode=0 Jan 27 16:15:01 crc kubenswrapper[4772]: I0127 16:15:01.297867 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" event={"ID":"0be908ee-6173-4ee8-80c4-0738697898d2","Type":"ContainerDied","Data":"d013dea461e279e8b861558e82f04a509da66ccae91eabf32103d04803eb33bd"} Jan 27 16:15:01 crc kubenswrapper[4772]: I0127 16:15:01.297901 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" event={"ID":"0be908ee-6173-4ee8-80c4-0738697898d2","Type":"ContainerStarted","Data":"9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398"} Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.577490 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.698684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume\") pod \"0be908ee-6173-4ee8-80c4-0738697898d2\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.698851 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flppb\" (UniqueName: \"kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb\") pod \"0be908ee-6173-4ee8-80c4-0738697898d2\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.698891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume\") pod \"0be908ee-6173-4ee8-80c4-0738697898d2\" (UID: \"0be908ee-6173-4ee8-80c4-0738697898d2\") " Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.699568 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "0be908ee-6173-4ee8-80c4-0738697898d2" (UID: "0be908ee-6173-4ee8-80c4-0738697898d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.703766 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0be908ee-6173-4ee8-80c4-0738697898d2" (UID: "0be908ee-6173-4ee8-80c4-0738697898d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.704011 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb" (OuterVolumeSpecName: "kube-api-access-flppb") pod "0be908ee-6173-4ee8-80c4-0738697898d2" (UID: "0be908ee-6173-4ee8-80c4-0738697898d2"). InnerVolumeSpecName "kube-api-access-flppb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.800325 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flppb\" (UniqueName: \"kubernetes.io/projected/0be908ee-6173-4ee8-80c4-0738697898d2-kube-api-access-flppb\") on node \"crc\" DevicePath \"\"" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.800354 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0be908ee-6173-4ee8-80c4-0738697898d2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:15:02 crc kubenswrapper[4772]: I0127 16:15:02.800364 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0be908ee-6173-4ee8-80c4-0738697898d2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:15:03 crc kubenswrapper[4772]: I0127 16:15:03.314621 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" event={"ID":"0be908ee-6173-4ee8-80c4-0738697898d2","Type":"ContainerDied","Data":"9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398"} Jan 27 16:15:03 crc kubenswrapper[4772]: I0127 16:15:03.314922 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbba9000225be6d4721ad0c912ef6a2b8989be7041107ccb5241f8ca0ed8398" Jan 27 16:15:03 crc kubenswrapper[4772]: I0127 16:15:03.314697 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4" Jan 27 16:15:03 crc kubenswrapper[4772]: I0127 16:15:03.646178 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6"] Jan 27 16:15:03 crc kubenswrapper[4772]: I0127 16:15:03.653045 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492130-7lgz6"] Jan 27 16:15:04 crc kubenswrapper[4772]: I0127 16:15:04.674225 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d103a19-1490-433a-abdb-3ebd279265f5" path="/var/lib/kubelet/pods/6d103a19-1490-433a-abdb-3ebd279265f5/volumes" Jan 27 16:15:06 crc kubenswrapper[4772]: I0127 16:15:06.014859 4772 scope.go:117] "RemoveContainer" containerID="58e0f9aeee1bc53c7d023bfdbaa2444440ab205390cfd9df2a1973966a2ae19f" Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.058767 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.059417 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.059500 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.060155 4772 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.060235 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905" gracePeriod=600 Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.371510 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905" exitCode=0 Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.371737 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905"} Jan 27 16:15:12 crc kubenswrapper[4772]: I0127 16:15:12.371916 4772 scope.go:117] "RemoveContainer" containerID="619886d7924cc8d7020cb14dc925242ea7fcb59d2ec01dae8b4e97b29bb44e28" Jan 27 16:15:13 crc kubenswrapper[4772]: I0127 16:15:13.382641 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d"} Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.000921 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:15:57 crc 
kubenswrapper[4772]: E0127 16:15:57.001834 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be908ee-6173-4ee8-80c4-0738697898d2" containerName="collect-profiles" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.001849 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be908ee-6173-4ee8-80c4-0738697898d2" containerName="collect-profiles" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.002027 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be908ee-6173-4ee8-80c4-0738697898d2" containerName="collect-profiles" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.003202 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.013862 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.178102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.178212 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.178245 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbqf\" (UniqueName: 
\"kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.279508 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.279572 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbqf\" (UniqueName: \"kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.279671 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.280117 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.280484 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.307076 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbqf\" (UniqueName: \"kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf\") pod \"community-operators-klr7d\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.323907 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:15:57 crc kubenswrapper[4772]: I0127 16:15:57.840766 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:15:58 crc kubenswrapper[4772]: I0127 16:15:58.723283 4772 generic.go:334] "Generic (PLEG): container finished" podID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerID="dc989b02724c66a718647f7d51ad4282c746eb41e28bfd955a90b8dfc944f809" exitCode=0 Jan 27 16:15:58 crc kubenswrapper[4772]: I0127 16:15:58.723344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerDied","Data":"dc989b02724c66a718647f7d51ad4282c746eb41e28bfd955a90b8dfc944f809"} Jan 27 16:15:58 crc kubenswrapper[4772]: I0127 16:15:58.723696 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerStarted","Data":"dc0a19ec7b4587b2e1794721655f1f30ca61ab62d049253d7b1c0a202d8024ba"} Jan 27 16:15:58 crc kubenswrapper[4772]: I0127 16:15:58.725134 4772 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 27 16:15:59 crc kubenswrapper[4772]: I0127 16:15:59.733613 4772 generic.go:334] "Generic (PLEG): container finished" podID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerID="30792775c2c579d6914fb2fed483b2e9a5316c264bffd3bdc2010be3ded311b9" exitCode=0 Jan 27 16:15:59 crc kubenswrapper[4772]: I0127 16:15:59.733670 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerDied","Data":"30792775c2c579d6914fb2fed483b2e9a5316c264bffd3bdc2010be3ded311b9"} Jan 27 16:15:59 crc kubenswrapper[4772]: I0127 16:15:59.997278 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:15:59 crc kubenswrapper[4772]: I0127 16:15:59.998984 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.012951 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.016700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnx5\" (UniqueName: \"kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.016757 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 
crc kubenswrapper[4772]: I0127 16:16:00.016788 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.117726 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnx5\" (UniqueName: \"kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.117791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.117821 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.118336 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.118527 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.138302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnx5\" (UniqueName: \"kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5\") pod \"redhat-operators-pqs7f\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.327956 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.742229 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerStarted","Data":"28d4cd39be1138334db2b7783e62aa2ee0c778982c5eaf0a247d5529afdfc14d"} Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.754092 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:16:00 crc kubenswrapper[4772]: W0127 16:16:00.765158 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d3806b_ef7d_43c0_9c33_e08d0f8dcb63.slice/crio-a901b081ae362cf564826808ed535c7fccd3df2a57e808801afadcbe086a2b9b WatchSource:0}: Error finding container a901b081ae362cf564826808ed535c7fccd3df2a57e808801afadcbe086a2b9b: Status 404 returned error can't find the container with id a901b081ae362cf564826808ed535c7fccd3df2a57e808801afadcbe086a2b9b Jan 27 16:16:00 crc kubenswrapper[4772]: I0127 16:16:00.788957 
4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-klr7d" podStartSLOduration=3.370697252 podStartE2EDuration="4.788941905s" podCreationTimestamp="2026-01-27 16:15:56 +0000 UTC" firstStartedPulling="2026-01-27 16:15:58.724879859 +0000 UTC m=+4144.705488967" lastFinishedPulling="2026-01-27 16:16:00.143124522 +0000 UTC m=+4146.123733620" observedRunningTime="2026-01-27 16:16:00.786309829 +0000 UTC m=+4146.766918937" watchObservedRunningTime="2026-01-27 16:16:00.788941905 +0000 UTC m=+4146.769551003" Jan 27 16:16:01 crc kubenswrapper[4772]: I0127 16:16:01.750247 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerID="6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522" exitCode=0 Jan 27 16:16:01 crc kubenswrapper[4772]: I0127 16:16:01.751304 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerDied","Data":"6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522"} Jan 27 16:16:01 crc kubenswrapper[4772]: I0127 16:16:01.751365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerStarted","Data":"a901b081ae362cf564826808ed535c7fccd3df2a57e808801afadcbe086a2b9b"} Jan 27 16:16:02 crc kubenswrapper[4772]: I0127 16:16:02.760025 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerStarted","Data":"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28"} Jan 27 16:16:03 crc kubenswrapper[4772]: I0127 16:16:03.771961 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" 
containerID="f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28" exitCode=0 Jan 27 16:16:03 crc kubenswrapper[4772]: I0127 16:16:03.772100 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerDied","Data":"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28"} Jan 27 16:16:04 crc kubenswrapper[4772]: I0127 16:16:04.782836 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerStarted","Data":"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf"} Jan 27 16:16:04 crc kubenswrapper[4772]: I0127 16:16:04.806599 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqs7f" podStartSLOduration=3.232537149 podStartE2EDuration="5.806572403s" podCreationTimestamp="2026-01-27 16:15:59 +0000 UTC" firstStartedPulling="2026-01-27 16:16:01.75198679 +0000 UTC m=+4147.732595888" lastFinishedPulling="2026-01-27 16:16:04.326022044 +0000 UTC m=+4150.306631142" observedRunningTime="2026-01-27 16:16:04.800725635 +0000 UTC m=+4150.781334743" watchObservedRunningTime="2026-01-27 16:16:04.806572403 +0000 UTC m=+4150.787181501" Jan 27 16:16:07 crc kubenswrapper[4772]: I0127 16:16:07.324521 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:07 crc kubenswrapper[4772]: I0127 16:16:07.324855 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:07 crc kubenswrapper[4772]: I0127 16:16:07.365217 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:07 crc kubenswrapper[4772]: I0127 16:16:07.840896 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:08 crc kubenswrapper[4772]: I0127 16:16:08.793212 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:16:09 crc kubenswrapper[4772]: I0127 16:16:09.813463 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-klr7d" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="registry-server" containerID="cri-o://28d4cd39be1138334db2b7783e62aa2ee0c778982c5eaf0a247d5529afdfc14d" gracePeriod=2 Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.329229 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.329293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.371953 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.836016 4772 generic.go:334] "Generic (PLEG): container finished" podID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerID="28d4cd39be1138334db2b7783e62aa2ee0c778982c5eaf0a247d5529afdfc14d" exitCode=0 Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.836856 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerDied","Data":"28d4cd39be1138334db2b7783e62aa2ee0c778982c5eaf0a247d5529afdfc14d"} Jan 27 16:16:10 crc kubenswrapper[4772]: I0127 16:16:10.886442 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 
16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.191967 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.437943 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.480943 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbqf\" (UniqueName: \"kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf\") pod \"0d53e726-f66e-4996-84fb-2f2547cadd29\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.481020 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content\") pod \"0d53e726-f66e-4996-84fb-2f2547cadd29\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.481206 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities\") pod \"0d53e726-f66e-4996-84fb-2f2547cadd29\" (UID: \"0d53e726-f66e-4996-84fb-2f2547cadd29\") " Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.482220 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities" (OuterVolumeSpecName: "utilities") pod "0d53e726-f66e-4996-84fb-2f2547cadd29" (UID: "0d53e726-f66e-4996-84fb-2f2547cadd29"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.486134 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf" (OuterVolumeSpecName: "kube-api-access-grbqf") pod "0d53e726-f66e-4996-84fb-2f2547cadd29" (UID: "0d53e726-f66e-4996-84fb-2f2547cadd29"). InnerVolumeSpecName "kube-api-access-grbqf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.531677 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d53e726-f66e-4996-84fb-2f2547cadd29" (UID: "0d53e726-f66e-4996-84fb-2f2547cadd29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.583540 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbqf\" (UniqueName: \"kubernetes.io/projected/0d53e726-f66e-4996-84fb-2f2547cadd29-kube-api-access-grbqf\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.583590 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.583604 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d53e726-f66e-4996-84fb-2f2547cadd29-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.846305 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klr7d" 
event={"ID":"0d53e726-f66e-4996-84fb-2f2547cadd29","Type":"ContainerDied","Data":"dc0a19ec7b4587b2e1794721655f1f30ca61ab62d049253d7b1c0a202d8024ba"} Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.846388 4772 scope.go:117] "RemoveContainer" containerID="28d4cd39be1138334db2b7783e62aa2ee0c778982c5eaf0a247d5529afdfc14d" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.846609 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klr7d" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.864328 4772 scope.go:117] "RemoveContainer" containerID="30792775c2c579d6914fb2fed483b2e9a5316c264bffd3bdc2010be3ded311b9" Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.877632 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.888388 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-klr7d"] Jan 27 16:16:11 crc kubenswrapper[4772]: I0127 16:16:11.994755 4772 scope.go:117] "RemoveContainer" containerID="dc989b02724c66a718647f7d51ad4282c746eb41e28bfd955a90b8dfc944f809" Jan 27 16:16:12 crc kubenswrapper[4772]: I0127 16:16:12.673586 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" path="/var/lib/kubelet/pods/0d53e726-f66e-4996-84fb-2f2547cadd29/volumes" Jan 27 16:16:12 crc kubenswrapper[4772]: I0127 16:16:12.853459 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqs7f" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="registry-server" containerID="cri-o://3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf" gracePeriod=2 Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.336797 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.410376 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgnx5\" (UniqueName: \"kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5\") pod \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.410433 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content\") pod \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.410497 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities\") pod \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\" (UID: \"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63\") " Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.411758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities" (OuterVolumeSpecName: "utilities") pod "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" (UID: "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.414739 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5" (OuterVolumeSpecName: "kube-api-access-tgnx5") pod "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" (UID: "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63"). InnerVolumeSpecName "kube-api-access-tgnx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.511716 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgnx5\" (UniqueName: \"kubernetes.io/projected/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-kube-api-access-tgnx5\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.511753 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.797403 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" (UID: "b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.815680 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.862067 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerID="3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf" exitCode=0 Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.862159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerDied","Data":"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf"} Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.862192 4772 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqs7f" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.862243 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqs7f" event={"ID":"b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63","Type":"ContainerDied","Data":"a901b081ae362cf564826808ed535c7fccd3df2a57e808801afadcbe086a2b9b"} Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.862270 4772 scope.go:117] "RemoveContainer" containerID="3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.882449 4772 scope.go:117] "RemoveContainer" containerID="f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.903943 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.909125 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqs7f"] Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.910114 4772 scope.go:117] "RemoveContainer" containerID="6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.935201 4772 scope.go:117] "RemoveContainer" containerID="3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf" Jan 27 16:16:13 crc kubenswrapper[4772]: E0127 16:16:13.935757 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf\": container with ID starting with 3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf not found: ID does not exist" containerID="3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.935796 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf"} err="failed to get container status \"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf\": rpc error: code = NotFound desc = could not find container \"3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf\": container with ID starting with 3ff98be81629d59868925742371033e6e9018417b497cb5940b98d61c9f755cf not found: ID does not exist" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.935817 4772 scope.go:117] "RemoveContainer" containerID="f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28" Jan 27 16:16:13 crc kubenswrapper[4772]: E0127 16:16:13.936155 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28\": container with ID starting with f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28 not found: ID does not exist" containerID="f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.936235 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28"} err="failed to get container status \"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28\": rpc error: code = NotFound desc = could not find container \"f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28\": container with ID starting with f12cc8ff7554ccc6c80e45264e31a5ca39fa255c5b86b2f701314ef2b2c08a28 not found: ID does not exist" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.936284 4772 scope.go:117] "RemoveContainer" containerID="6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522" Jan 27 16:16:13 crc kubenswrapper[4772]: E0127 
16:16:13.936548 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522\": container with ID starting with 6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522 not found: ID does not exist" containerID="6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522" Jan 27 16:16:13 crc kubenswrapper[4772]: I0127 16:16:13.936579 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522"} err="failed to get container status \"6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522\": rpc error: code = NotFound desc = could not find container \"6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522\": container with ID starting with 6c518bdaf4840ef38edee20a0af898e5c9269c61f9c8cfe19cad3cd8770c5522 not found: ID does not exist" Jan 27 16:16:14 crc kubenswrapper[4772]: I0127 16:16:14.675905 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" path="/var/lib/kubelet/pods/b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63/volumes" Jan 27 16:17:12 crc kubenswrapper[4772]: I0127 16:17:12.058349 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:17:12 crc kubenswrapper[4772]: I0127 16:17:12.059154 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 27 16:17:42 crc kubenswrapper[4772]: I0127 16:17:42.059217 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:17:42 crc kubenswrapper[4772]: I0127 16:17:42.059784 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.058707 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.059301 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.059345 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.059899 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d"} 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.059946 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" gracePeriod=600 Jan 27 16:18:12 crc kubenswrapper[4772]: E0127 16:18:12.188522 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.769899 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" exitCode=0 Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.770003 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d"} Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.770142 4772 scope.go:117] "RemoveContainer" containerID="50cc7379b98d80b676c41cd73458def5401d0a1c59e714228729e6bc1cefe905" Jan 27 16:18:12 crc kubenswrapper[4772]: I0127 16:18:12.771258 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 
27 16:18:12 crc kubenswrapper[4772]: E0127 16:18:12.771666 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:18:27 crc kubenswrapper[4772]: I0127 16:18:27.663598 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:18:27 crc kubenswrapper[4772]: E0127 16:18:27.664508 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:18:42 crc kubenswrapper[4772]: I0127 16:18:42.663483 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:18:42 crc kubenswrapper[4772]: E0127 16:18:42.664540 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:18:53 crc kubenswrapper[4772]: I0127 16:18:53.667423 4772 scope.go:117] "RemoveContainer" 
containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:18:53 crc kubenswrapper[4772]: E0127 16:18:53.669531 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:19:06 crc kubenswrapper[4772]: I0127 16:19:06.664753 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:19:06 crc kubenswrapper[4772]: E0127 16:19:06.665842 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:19:17 crc kubenswrapper[4772]: I0127 16:19:17.663268 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:19:17 crc kubenswrapper[4772]: E0127 16:19:17.664006 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:19:28 crc kubenswrapper[4772]: I0127 16:19:28.663658 4772 scope.go:117] 
"RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:19:28 crc kubenswrapper[4772]: E0127 16:19:28.664463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:19:41 crc kubenswrapper[4772]: I0127 16:19:41.663619 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:19:41 crc kubenswrapper[4772]: E0127 16:19:41.664447 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:19:56 crc kubenswrapper[4772]: I0127 16:19:56.663423 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:19:56 crc kubenswrapper[4772]: E0127 16:19:56.664228 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:07 crc kubenswrapper[4772]: I0127 16:20:07.663599 
4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:20:07 crc kubenswrapper[4772]: E0127 16:20:07.664419 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:20 crc kubenswrapper[4772]: I0127 16:20:20.663473 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:20:20 crc kubenswrapper[4772]: E0127 16:20:20.664417 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:32 crc kubenswrapper[4772]: I0127 16:20:32.662968 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:20:32 crc kubenswrapper[4772]: E0127 16:20:32.664131 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 
16:20:43.051941 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052830 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="extract-utilities" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052848 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="extract-utilities" Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052865 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="extract-content" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052872 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="extract-content" Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052891 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="extract-content" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052899 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="extract-content" Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052909 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052918 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052928 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="extract-utilities" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052934 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="extract-utilities" Jan 27 16:20:43 crc kubenswrapper[4772]: E0127 16:20:43.052950 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.052957 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.053123 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d3806b-ef7d-43c0-9c33-e08d0f8dcb63" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.053136 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d53e726-f66e-4996-84fb-2f2547cadd29" containerName="registry-server" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.054634 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.066101 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.237233 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.237462 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.237648 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfjhc\" (UniqueName: \"kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.338719 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfjhc\" (UniqueName: \"kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.339093 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.339308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.339626 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.339770 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.370281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfjhc\" (UniqueName: \"kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc\") pod \"redhat-marketplace-rd4fm\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.405732 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.843278 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:43 crc kubenswrapper[4772]: I0127 16:20:43.982544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerStarted","Data":"53686951f0ef185c4075e7e81b31f486c3ba81d625f01a117518d6499db1b0d8"} Jan 27 16:20:44 crc kubenswrapper[4772]: E0127 16:20:44.214731 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47938e85_9f22_40fc_a57a_6ba9649553eb.slice/crio-b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc.scope\": RecentStats: unable to find data in memory cache]" Jan 27 16:20:44 crc kubenswrapper[4772]: I0127 16:20:44.996481 4772 generic.go:334] "Generic (PLEG): container finished" podID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerID="b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc" exitCode=0 Jan 27 16:20:44 crc kubenswrapper[4772]: I0127 16:20:44.996580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerDied","Data":"b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc"} Jan 27 16:20:45 crc kubenswrapper[4772]: I0127 16:20:45.663402 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:20:45 crc kubenswrapper[4772]: E0127 16:20:45.663692 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:46 crc kubenswrapper[4772]: I0127 16:20:46.005933 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerStarted","Data":"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78"} Jan 27 16:20:47 crc kubenswrapper[4772]: I0127 16:20:47.013809 4772 generic.go:334] "Generic (PLEG): container finished" podID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerID="1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78" exitCode=0 Jan 27 16:20:47 crc kubenswrapper[4772]: I0127 16:20:47.013859 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerDied","Data":"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78"} Jan 27 16:20:48 crc kubenswrapper[4772]: I0127 16:20:48.024793 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerStarted","Data":"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0"} Jan 27 16:20:48 crc kubenswrapper[4772]: I0127 16:20:48.042153 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rd4fm" podStartSLOduration=2.589876625 podStartE2EDuration="5.042134067s" podCreationTimestamp="2026-01-27 16:20:43 +0000 UTC" firstStartedPulling="2026-01-27 16:20:45.000552811 +0000 UTC m=+4430.981161919" lastFinishedPulling="2026-01-27 16:20:47.452810263 +0000 UTC m=+4433.433419361" observedRunningTime="2026-01-27 16:20:48.039974575 
+0000 UTC m=+4434.020583703" watchObservedRunningTime="2026-01-27 16:20:48.042134067 +0000 UTC m=+4434.022743185" Jan 27 16:20:53 crc kubenswrapper[4772]: I0127 16:20:53.406845 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:53 crc kubenswrapper[4772]: I0127 16:20:53.407690 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:53 crc kubenswrapper[4772]: I0127 16:20:53.464420 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:54 crc kubenswrapper[4772]: I0127 16:20:54.186254 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:54 crc kubenswrapper[4772]: I0127 16:20:54.252101 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.145217 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rd4fm" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="registry-server" containerID="cri-o://2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0" gracePeriod=2 Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.554673 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.633021 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities\") pod \"47938e85-9f22-40fc-a57a-6ba9649553eb\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.633070 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content\") pod \"47938e85-9f22-40fc-a57a-6ba9649553eb\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.634282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities" (OuterVolumeSpecName: "utilities") pod "47938e85-9f22-40fc-a57a-6ba9649553eb" (UID: "47938e85-9f22-40fc-a57a-6ba9649553eb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.662911 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:20:56 crc kubenswrapper[4772]: E0127 16:20:56.663314 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.665283 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47938e85-9f22-40fc-a57a-6ba9649553eb" (UID: "47938e85-9f22-40fc-a57a-6ba9649553eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.733853 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfjhc\" (UniqueName: \"kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc\") pod \"47938e85-9f22-40fc-a57a-6ba9649553eb\" (UID: \"47938e85-9f22-40fc-a57a-6ba9649553eb\") " Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.734204 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.734226 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47938e85-9f22-40fc-a57a-6ba9649553eb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.740227 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc" (OuterVolumeSpecName: "kube-api-access-sfjhc") pod "47938e85-9f22-40fc-a57a-6ba9649553eb" (UID: "47938e85-9f22-40fc-a57a-6ba9649553eb"). InnerVolumeSpecName "kube-api-access-sfjhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:20:56 crc kubenswrapper[4772]: I0127 16:20:56.835436 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfjhc\" (UniqueName: \"kubernetes.io/projected/47938e85-9f22-40fc-a57a-6ba9649553eb-kube-api-access-sfjhc\") on node \"crc\" DevicePath \"\"" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.153224 4772 generic.go:334] "Generic (PLEG): container finished" podID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerID="2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0" exitCode=0 Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.153272 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerDied","Data":"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0"} Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.153306 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rd4fm" event={"ID":"47938e85-9f22-40fc-a57a-6ba9649553eb","Type":"ContainerDied","Data":"53686951f0ef185c4075e7e81b31f486c3ba81d625f01a117518d6499db1b0d8"} Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.153310 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rd4fm" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.153326 4772 scope.go:117] "RemoveContainer" containerID="2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.182062 4772 scope.go:117] "RemoveContainer" containerID="1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.187401 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.192108 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rd4fm"] Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.209717 4772 scope.go:117] "RemoveContainer" containerID="b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.237947 4772 scope.go:117] "RemoveContainer" containerID="2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0" Jan 27 16:20:57 crc kubenswrapper[4772]: E0127 16:20:57.238474 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0\": container with ID starting with 2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0 not found: ID does not exist" containerID="2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.238512 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0"} err="failed to get container status \"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0\": rpc error: code = NotFound desc = could not find container 
\"2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0\": container with ID starting with 2049e20fe6b41b29f6a3b38b6b5f1356130ccd8fa55ed8d34829d3cca6d72ae0 not found: ID does not exist" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.238533 4772 scope.go:117] "RemoveContainer" containerID="1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78" Jan 27 16:20:57 crc kubenswrapper[4772]: E0127 16:20:57.238932 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78\": container with ID starting with 1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78 not found: ID does not exist" containerID="1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.238951 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78"} err="failed to get container status \"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78\": rpc error: code = NotFound desc = could not find container \"1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78\": container with ID starting with 1f505067a0b82eec42c5b3819599d6f717108d18b5afebdd2d913d27113deb78 not found: ID does not exist" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.238963 4772 scope.go:117] "RemoveContainer" containerID="b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc" Jan 27 16:20:57 crc kubenswrapper[4772]: E0127 16:20:57.240763 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc\": container with ID starting with b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc not found: ID does not exist" 
containerID="b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc" Jan 27 16:20:57 crc kubenswrapper[4772]: I0127 16:20:57.240929 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc"} err="failed to get container status \"b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc\": rpc error: code = NotFound desc = could not find container \"b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc\": container with ID starting with b20566d45bb50067ab86a9301c3513da2cd70c1f536b0bb394b306a49b9d08fc not found: ID does not exist" Jan 27 16:20:58 crc kubenswrapper[4772]: I0127 16:20:58.678386 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" path="/var/lib/kubelet/pods/47938e85-9f22-40fc-a57a-6ba9649553eb/volumes" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.977792 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:00 crc kubenswrapper[4772]: E0127 16:21:00.978156 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="extract-utilities" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.978190 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="extract-utilities" Jan 27 16:21:00 crc kubenswrapper[4772]: E0127 16:21:00.978218 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="registry-server" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.978227 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="registry-server" Jan 27 16:21:00 crc kubenswrapper[4772]: E0127 16:21:00.978244 4772 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="extract-content" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.978252 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="extract-content" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.978466 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="47938e85-9f22-40fc-a57a-6ba9649553eb" containerName="registry-server" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.979798 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.991992 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.995199 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.995402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:00 crc kubenswrapper[4772]: I0127 16:21:00.995610 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9q7\" (UniqueName: \"kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7\") pod 
\"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.097307 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.097382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.097452 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9q7\" (UniqueName: \"kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.098418 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.098682 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content\") pod \"certified-operators-8cfm5\" (UID: 
\"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.117271 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9q7\" (UniqueName: \"kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7\") pod \"certified-operators-8cfm5\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.298092 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:01 crc kubenswrapper[4772]: I0127 16:21:01.557621 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:02 crc kubenswrapper[4772]: I0127 16:21:02.191743 4772 generic.go:334] "Generic (PLEG): container finished" podID="5854174b-59dc-43b7-aea1-321b0762e938" containerID="a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894" exitCode=0 Jan 27 16:21:02 crc kubenswrapper[4772]: I0127 16:21:02.191853 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerDied","Data":"a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894"} Jan 27 16:21:02 crc kubenswrapper[4772]: I0127 16:21:02.192134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerStarted","Data":"a25e0c08dab3d89878a502715c0666ee0ba53b97d33dc1e0735cbdd3a0adbe26"} Jan 27 16:21:02 crc kubenswrapper[4772]: I0127 16:21:02.193892 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:21:04 crc kubenswrapper[4772]: I0127 16:21:04.219029 4772 
generic.go:334] "Generic (PLEG): container finished" podID="5854174b-59dc-43b7-aea1-321b0762e938" containerID="09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84" exitCode=0 Jan 27 16:21:04 crc kubenswrapper[4772]: I0127 16:21:04.219098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerDied","Data":"09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84"} Jan 27 16:21:05 crc kubenswrapper[4772]: I0127 16:21:05.228900 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerStarted","Data":"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af"} Jan 27 16:21:05 crc kubenswrapper[4772]: I0127 16:21:05.251978 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8cfm5" podStartSLOduration=2.48031297 podStartE2EDuration="5.251961878s" podCreationTimestamp="2026-01-27 16:21:00 +0000 UTC" firstStartedPulling="2026-01-27 16:21:02.193654958 +0000 UTC m=+4448.174264056" lastFinishedPulling="2026-01-27 16:21:04.965303826 +0000 UTC m=+4450.945912964" observedRunningTime="2026-01-27 16:21:05.249311372 +0000 UTC m=+4451.229920470" watchObservedRunningTime="2026-01-27 16:21:05.251961878 +0000 UTC m=+4451.232570976" Jan 27 16:21:07 crc kubenswrapper[4772]: I0127 16:21:07.662665 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:21:07 crc kubenswrapper[4772]: E0127 16:21:07.663198 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:21:11 crc kubenswrapper[4772]: I0127 16:21:11.298473 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:11 crc kubenswrapper[4772]: I0127 16:21:11.300284 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:11 crc kubenswrapper[4772]: I0127 16:21:11.346822 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:12 crc kubenswrapper[4772]: I0127 16:21:12.315269 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:12 crc kubenswrapper[4772]: I0127 16:21:12.764773 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.292340 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cfm5" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="registry-server" containerID="cri-o://64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af" gracePeriod=2 Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.779794 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.909221 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content\") pod \"5854174b-59dc-43b7-aea1-321b0762e938\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.909288 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9q7\" (UniqueName: \"kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7\") pod \"5854174b-59dc-43b7-aea1-321b0762e938\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.909328 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities\") pod \"5854174b-59dc-43b7-aea1-321b0762e938\" (UID: \"5854174b-59dc-43b7-aea1-321b0762e938\") " Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.910851 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities" (OuterVolumeSpecName: "utilities") pod "5854174b-59dc-43b7-aea1-321b0762e938" (UID: "5854174b-59dc-43b7-aea1-321b0762e938"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:21:14 crc kubenswrapper[4772]: I0127 16:21:14.915072 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7" (OuterVolumeSpecName: "kube-api-access-tv9q7") pod "5854174b-59dc-43b7-aea1-321b0762e938" (UID: "5854174b-59dc-43b7-aea1-321b0762e938"). InnerVolumeSpecName "kube-api-access-tv9q7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.011624 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9q7\" (UniqueName: \"kubernetes.io/projected/5854174b-59dc-43b7-aea1-321b0762e938-kube-api-access-tv9q7\") on node \"crc\" DevicePath \"\"" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.011672 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.113727 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5854174b-59dc-43b7-aea1-321b0762e938" (UID: "5854174b-59dc-43b7-aea1-321b0762e938"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.214505 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5854174b-59dc-43b7-aea1-321b0762e938-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.300843 4772 generic.go:334] "Generic (PLEG): container finished" podID="5854174b-59dc-43b7-aea1-321b0762e938" containerID="64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af" exitCode=0 Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.300898 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerDied","Data":"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af"} Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.300931 4772 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cfm5" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.300954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cfm5" event={"ID":"5854174b-59dc-43b7-aea1-321b0762e938","Type":"ContainerDied","Data":"a25e0c08dab3d89878a502715c0666ee0ba53b97d33dc1e0735cbdd3a0adbe26"} Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.300971 4772 scope.go:117] "RemoveContainer" containerID="64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.317738 4772 scope.go:117] "RemoveContainer" containerID="09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.345142 4772 scope.go:117] "RemoveContainer" containerID="a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.346256 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.359109 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cfm5"] Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.365200 4772 scope.go:117] "RemoveContainer" containerID="64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af" Jan 27 16:21:15 crc kubenswrapper[4772]: E0127 16:21:15.365840 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af\": container with ID starting with 64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af not found: ID does not exist" containerID="64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.365971 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af"} err="failed to get container status \"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af\": rpc error: code = NotFound desc = could not find container \"64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af\": container with ID starting with 64a13b2a934cc6a3e677cf6801cefaba161f5440fb5054836df1740f71d8d1af not found: ID does not exist" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.366054 4772 scope.go:117] "RemoveContainer" containerID="09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84" Jan 27 16:21:15 crc kubenswrapper[4772]: E0127 16:21:15.366484 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84\": container with ID starting with 09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84 not found: ID does not exist" containerID="09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.366530 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84"} err="failed to get container status \"09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84\": rpc error: code = NotFound desc = could not find container \"09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84\": container with ID starting with 09b6440ec58b2621d6bef188feee41fb9bd95c35cb9a627aa339612e9aef4d84 not found: ID does not exist" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.366558 4772 scope.go:117] "RemoveContainer" containerID="a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894" Jan 27 16:21:15 crc kubenswrapper[4772]: E0127 
16:21:15.366831 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894\": container with ID starting with a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894 not found: ID does not exist" containerID="a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894" Jan 27 16:21:15 crc kubenswrapper[4772]: I0127 16:21:15.366921 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894"} err="failed to get container status \"a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894\": rpc error: code = NotFound desc = could not find container \"a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894\": container with ID starting with a6ca15ff68296c29c6cc30ddf45249de19f9df9c7b117f0c3db8c30a8ef62894 not found: ID does not exist" Jan 27 16:21:16 crc kubenswrapper[4772]: I0127 16:21:16.675285 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5854174b-59dc-43b7-aea1-321b0762e938" path="/var/lib/kubelet/pods/5854174b-59dc-43b7-aea1-321b0762e938/volumes" Jan 27 16:21:20 crc kubenswrapper[4772]: I0127 16:21:20.663278 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:21:20 crc kubenswrapper[4772]: E0127 16:21:20.664208 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:21:34 crc kubenswrapper[4772]: I0127 16:21:34.670583 
4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:21:34 crc kubenswrapper[4772]: E0127 16:21:34.671314 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:21:48 crc kubenswrapper[4772]: I0127 16:21:48.663646 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:21:48 crc kubenswrapper[4772]: E0127 16:21:48.664500 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:22:02 crc kubenswrapper[4772]: I0127 16:22:02.663695 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:22:02 crc kubenswrapper[4772]: E0127 16:22:02.664639 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:22:13 crc kubenswrapper[4772]: I0127 
16:22:13.663487 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:22:13 crc kubenswrapper[4772]: E0127 16:22:13.664394 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:22:27 crc kubenswrapper[4772]: I0127 16:22:27.663201 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:22:27 crc kubenswrapper[4772]: E0127 16:22:27.663982 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.197229 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-pmrs5"] Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.204183 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-pmrs5"] Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.314393 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-zb2ds"] Jan 27 16:22:38 crc kubenswrapper[4772]: E0127 16:22:38.318620 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="extract-utilities" Jan 27 16:22:38 crc 
kubenswrapper[4772]: I0127 16:22:38.318880 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="extract-utilities" Jan 27 16:22:38 crc kubenswrapper[4772]: E0127 16:22:38.318960 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="extract-content" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.319036 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="extract-content" Jan 27 16:22:38 crc kubenswrapper[4772]: E0127 16:22:38.319100 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="registry-server" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.319157 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="registry-server" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.319379 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5854174b-59dc-43b7-aea1-321b0762e938" containerName="registry-server" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.319934 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.322277 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.322626 4772 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-r9nmx" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.322969 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.323059 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.336373 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zb2ds"] Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.397827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64tn\" (UniqueName: \"kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.398133 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.398304 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt\") pod \"crc-storage-crc-zb2ds\" (UID: 
\"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.499594 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64tn\" (UniqueName: \"kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.500007 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.500309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.500608 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.501516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.523842 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64tn\" (UniqueName: \"kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn\") pod \"crc-storage-crc-zb2ds\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.645011 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:38 crc kubenswrapper[4772]: I0127 16:22:38.673401 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a5cafc-0519-4e90-9456-acb182176c41" path="/var/lib/kubelet/pods/32a5cafc-0519-4e90-9456-acb182176c41/volumes" Jan 27 16:22:39 crc kubenswrapper[4772]: I0127 16:22:39.064815 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-zb2ds"] Jan 27 16:22:39 crc kubenswrapper[4772]: I0127 16:22:39.936837 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zb2ds" event={"ID":"edf192be-2985-42cc-94da-3e4523dffe67","Type":"ContainerStarted","Data":"a88311208e71190cb8b4c34f99be89fcb0a6b7eb71b5dd4a8ea37aa6dbbc506b"} Jan 27 16:22:40 crc kubenswrapper[4772]: I0127 16:22:40.663456 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:22:40 crc kubenswrapper[4772]: E0127 16:22:40.663900 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:22:40 crc kubenswrapper[4772]: I0127 16:22:40.945517 4772 generic.go:334] "Generic (PLEG): container 
finished" podID="edf192be-2985-42cc-94da-3e4523dffe67" containerID="a62dadf36906064bb1b0580332d53e82d1766f5edc230560463a0bad481701be" exitCode=0 Jan 27 16:22:40 crc kubenswrapper[4772]: I0127 16:22:40.945700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zb2ds" event={"ID":"edf192be-2985-42cc-94da-3e4523dffe67","Type":"ContainerDied","Data":"a62dadf36906064bb1b0580332d53e82d1766f5edc230560463a0bad481701be"} Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.352260 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.459064 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64tn\" (UniqueName: \"kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn\") pod \"edf192be-2985-42cc-94da-3e4523dffe67\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.459604 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt\") pod \"edf192be-2985-42cc-94da-3e4523dffe67\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.459686 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "edf192be-2985-42cc-94da-3e4523dffe67" (UID: "edf192be-2985-42cc-94da-3e4523dffe67"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.459747 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage\") pod \"edf192be-2985-42cc-94da-3e4523dffe67\" (UID: \"edf192be-2985-42cc-94da-3e4523dffe67\") " Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.460047 4772 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/edf192be-2985-42cc-94da-3e4523dffe67-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.466325 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn" (OuterVolumeSpecName: "kube-api-access-s64tn") pod "edf192be-2985-42cc-94da-3e4523dffe67" (UID: "edf192be-2985-42cc-94da-3e4523dffe67"). InnerVolumeSpecName "kube-api-access-s64tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.488690 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "edf192be-2985-42cc-94da-3e4523dffe67" (UID: "edf192be-2985-42cc-94da-3e4523dffe67"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.561097 4772 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/edf192be-2985-42cc-94da-3e4523dffe67-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.561136 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s64tn\" (UniqueName: \"kubernetes.io/projected/edf192be-2985-42cc-94da-3e4523dffe67-kube-api-access-s64tn\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.963377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-zb2ds" event={"ID":"edf192be-2985-42cc-94da-3e4523dffe67","Type":"ContainerDied","Data":"a88311208e71190cb8b4c34f99be89fcb0a6b7eb71b5dd4a8ea37aa6dbbc506b"} Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.963420 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a88311208e71190cb8b4c34f99be89fcb0a6b7eb71b5dd4a8ea37aa6dbbc506b" Jan 27 16:22:42 crc kubenswrapper[4772]: I0127 16:22:42.963535 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-zb2ds" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.454958 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-zb2ds"] Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.461442 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-zb2ds"] Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.580903 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bnw5d"] Jan 27 16:22:44 crc kubenswrapper[4772]: E0127 16:22:44.581283 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edf192be-2985-42cc-94da-3e4523dffe67" containerName="storage" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.581310 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf192be-2985-42cc-94da-3e4523dffe67" containerName="storage" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.581499 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="edf192be-2985-42cc-94da-3e4523dffe67" containerName="storage" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.582064 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.584224 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.584284 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.585799 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.585881 4772 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-r9nmx" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.591668 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bnw5d"] Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.671856 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edf192be-2985-42cc-94da-3e4523dffe67" path="/var/lib/kubelet/pods/edf192be-2985-42cc-94da-3e4523dffe67/volumes" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.691525 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.691727 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpt6h\" (UniqueName: \"kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 
16:22:44.691790 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.792771 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.792908 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpt6h\" (UniqueName: \"kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.792940 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.793115 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.793987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.813130 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpt6h\" (UniqueName: \"kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h\") pod \"crc-storage-crc-bnw5d\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:44 crc kubenswrapper[4772]: I0127 16:22:44.900186 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:45 crc kubenswrapper[4772]: I0127 16:22:45.512346 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bnw5d"] Jan 27 16:22:45 crc kubenswrapper[4772]: I0127 16:22:45.987631 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bnw5d" event={"ID":"764343c1-a6b8-4600-8cb4-a0aa386d4cd2","Type":"ContainerStarted","Data":"5806739c0c411d54a997b2b6fdd6d93080c6d25dfe4b6a90ab2738dfa909d575"} Jan 27 16:22:48 crc kubenswrapper[4772]: I0127 16:22:48.001570 4772 generic.go:334] "Generic (PLEG): container finished" podID="764343c1-a6b8-4600-8cb4-a0aa386d4cd2" containerID="657bc401ac778c0e03331638c4f0d729dea942292c16996b872934a4ca92ef14" exitCode=0 Jan 27 16:22:48 crc kubenswrapper[4772]: I0127 16:22:48.001656 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bnw5d" event={"ID":"764343c1-a6b8-4600-8cb4-a0aa386d4cd2","Type":"ContainerDied","Data":"657bc401ac778c0e03331638c4f0d729dea942292c16996b872934a4ca92ef14"} Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.360155 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.502209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage\") pod \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.502272 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt\") pod \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.502307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpt6h\" (UniqueName: \"kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h\") pod \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\" (UID: \"764343c1-a6b8-4600-8cb4-a0aa386d4cd2\") " Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.502408 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "764343c1-a6b8-4600-8cb4-a0aa386d4cd2" (UID: "764343c1-a6b8-4600-8cb4-a0aa386d4cd2"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.502573 4772 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-node-mnt\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.511456 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h" (OuterVolumeSpecName: "kube-api-access-wpt6h") pod "764343c1-a6b8-4600-8cb4-a0aa386d4cd2" (UID: "764343c1-a6b8-4600-8cb4-a0aa386d4cd2"). InnerVolumeSpecName "kube-api-access-wpt6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.522717 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "764343c1-a6b8-4600-8cb4-a0aa386d4cd2" (UID: "764343c1-a6b8-4600-8cb4-a0aa386d4cd2"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.606283 4772 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-crc-storage\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:49 crc kubenswrapper[4772]: I0127 16:22:49.606325 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpt6h\" (UniqueName: \"kubernetes.io/projected/764343c1-a6b8-4600-8cb4-a0aa386d4cd2-kube-api-access-wpt6h\") on node \"crc\" DevicePath \"\"" Jan 27 16:22:50 crc kubenswrapper[4772]: I0127 16:22:50.015222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bnw5d" event={"ID":"764343c1-a6b8-4600-8cb4-a0aa386d4cd2","Type":"ContainerDied","Data":"5806739c0c411d54a997b2b6fdd6d93080c6d25dfe4b6a90ab2738dfa909d575"} Jan 27 16:22:50 crc kubenswrapper[4772]: I0127 16:22:50.015269 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5806739c0c411d54a997b2b6fdd6d93080c6d25dfe4b6a90ab2738dfa909d575" Jan 27 16:22:50 crc kubenswrapper[4772]: I0127 16:22:50.015394 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bnw5d" Jan 27 16:22:52 crc kubenswrapper[4772]: I0127 16:22:52.663314 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:22:52 crc kubenswrapper[4772]: E0127 16:22:52.664317 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:23:06 crc kubenswrapper[4772]: I0127 16:23:06.192801 4772 scope.go:117] "RemoveContainer" containerID="f4e8f8b6e9c9139e4588eb373fc8616c60521a1a5d0cbfb79f4f8c9d4dc676b9" Jan 27 16:23:07 crc kubenswrapper[4772]: I0127 16:23:07.662965 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:23:07 crc kubenswrapper[4772]: E0127 16:23:07.663483 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:23:18 crc kubenswrapper[4772]: I0127 16:23:18.663406 4772 scope.go:117] "RemoveContainer" containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:23:19 crc kubenswrapper[4772]: I0127 16:23:19.243759 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041"} Jan 27 16:25:42 crc kubenswrapper[4772]: I0127 16:25:42.058529 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:25:42 crc kubenswrapper[4772]: I0127 16:25:42.059100 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.249918 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"] Jan 27 16:25:57 crc kubenswrapper[4772]: E0127 16:25:57.254680 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="764343c1-a6b8-4600-8cb4-a0aa386d4cd2" containerName="storage" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.254702 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="764343c1-a6b8-4600-8cb4-a0aa386d4cd2" containerName="storage" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.254868 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="764343c1-a6b8-4600-8cb4-a0aa386d4cd2" containerName="storage" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.255781 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.262427 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.262530 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.262696 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.266485 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bzcld" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.266751 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.273785 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"] Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.384820 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566ms\" (UniqueName: \"kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.385060 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.385124 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.486552 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.486635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.486708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566ms\" (UniqueName: \"kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.487475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.487514 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.520935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566ms\" (UniqueName: \"kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms\") pod \"dnsmasq-dns-5d7b5456f5-rvc76\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") " pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.577453 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.613629 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.617329 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.629876 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.790051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqg6h\" (UniqueName: \"kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.790134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.790227 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.891634 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqg6h\" (UniqueName: \"kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.891697 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.891723 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.892690 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.892955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.908625 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqg6h\" (UniqueName: \"kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h\") pod \"dnsmasq-dns-98ddfc8f-mmzht\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:57 crc kubenswrapper[4772]: I0127 16:25:57.964031 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.099989 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.411078 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.412692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.417659 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.417785 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kn7v7" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.417912 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.418023 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.418057 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.426374 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.440310 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.474148 4772 generic.go:334] "Generic (PLEG): container finished" podID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerID="ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b" exitCode=0 Jan 27 16:25:58 crc 
kubenswrapper[4772]: I0127 16:25:58.474237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" event={"ID":"67aa19ec-be98-4d61-b758-0fd0f7f77f42","Type":"ContainerDied","Data":"ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b"} Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.474266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" event={"ID":"67aa19ec-be98-4d61-b758-0fd0f7f77f42","Type":"ContainerStarted","Data":"855fc44f7201e7a6bbe04d32cb4e1bdbca9adcf771a78c25383154b1e38bbac4"} Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.477311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" event={"ID":"bb8b7780-142e-4fd6-967f-a42e112a0b2e","Type":"ContainerStarted","Data":"78ea743dbcac291e88b556c9f353cf72b6797716babc1be9562796c00fc48cc9"} Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501533 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501582 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501618 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501638 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj9fs\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501660 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501721 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501754 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.501969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf\") pod \"rabbitmq-server-0\" 
(UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.502000 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.605936 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj9fs\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606066 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 
16:25:58.606101 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606178 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606231 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.606309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.607302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.607312 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.607590 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.607758 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.612021 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.612498 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " 
pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.614358 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.614620 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.614677 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/36a183de2c7abb5d8abee5f0c83592d4960d3f7dbd03e4d4afd32924fe238d72/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.634662 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj9fs\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.659768 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 
16:25:58.729215 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: E0127 16:25:58.753476 4772 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Jan 27 16:25:58 crc kubenswrapper[4772]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/67aa19ec-be98-4d61-b758-0fd0f7f77f42/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 27 16:25:58 crc kubenswrapper[4772]: > podSandboxID="855fc44f7201e7a6bbe04d32cb4e1bdbca9adcf771a78c25383154b1e38bbac4" Jan 27 16:25:58 crc kubenswrapper[4772]: E0127 16:25:58.753830 4772 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 27 16:25:58 crc kubenswrapper[4772]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-566ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-rvc76_openstack(67aa19ec-be98-4d61-b758-0fd0f7f77f42): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/67aa19ec-be98-4d61-b758-0fd0f7f77f42/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Jan 27 16:25:58 crc kubenswrapper[4772]: > logger="UnhandledError" Jan 27 16:25:58 crc kubenswrapper[4772]: E0127 16:25:58.755251 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/67aa19ec-be98-4d61-b758-0fd0f7f77f42/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.788906 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.790558 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.794743 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.794743 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.794851 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.794954 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-554gz" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.795332 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.809625 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.910269 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.910836 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.910881 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.910928 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qc4j\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.910952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.911051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.911091 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 
16:25:58.911123 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:58 crc kubenswrapper[4772]: I0127 16:25:58.911151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.012138 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.012215 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.012262 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qc4j\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc 
kubenswrapper[4772]: I0127 16:25:59.012286 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.012390 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.013528 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.013614 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.013960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014098 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014159 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014441 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014471 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/74f6c4054e7c826304958ef416594be6d4b6260f90a6b43d068948e9c0dc0fa0/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014573 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.014897 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.020214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.022219 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.022801 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.033362 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qc4j\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.045882 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.192279 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.198542 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.484167 4772 generic.go:334] "Generic (PLEG): container finished" podID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerID="16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5" exitCode=0 Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.484294 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" event={"ID":"bb8b7780-142e-4fd6-967f-a42e112a0b2e","Type":"ContainerDied","Data":"16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5"} Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.487028 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerStarted","Data":"10715889097bdfd0b2f4c8a7bc95c59af267c1bbc009f6e01130ee1ccb028c38"} Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.624584 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.734228 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.736014 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.742244 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.742968 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.743100 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.743270 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.743563 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-c4cld" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.745085 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.835778 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.835833 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.835966 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-xcqcw\" (UniqueName: \"kubernetes.io/projected/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kube-api-access-xcqcw\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.836055 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.836082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.836234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.836273 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.836336 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937071 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937121 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-operator-scripts\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937163 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937272 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.937317 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqcw\" (UniqueName: \"kubernetes.io/projected/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kube-api-access-xcqcw\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.938382 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-default\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.939659 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-operator-scripts\") pod \"openstack-galera-0\" 
(UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.941301 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kolla-config\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.943471 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-config-data-generated\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.945529 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.945581 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5764f99e1512039e35052d84fb68ae5f0622866614c46458b82d06b647e9a2a6/globalmount\"" pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.945668 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.946985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.955813 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqcw\" (UniqueName: \"kubernetes.io/projected/eefd7ff4-5222-45cf-aaad-20ebfd50a2ff-kube-api-access-xcqcw\") pod \"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:25:59 crc kubenswrapper[4772]: I0127 16:25:59.979132 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40b3b8a3-9743-4a00-b998-26333b0bfb3e\") pod 
\"openstack-galera-0\" (UID: \"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff\") " pod="openstack/openstack-galera-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.074127 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.075188 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.078869 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.079157 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gv4lb" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.085072 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.088493 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.141892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-kolla-config\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.142007 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8pmp\" (UniqueName: \"kubernetes.io/projected/36e53353-e817-4d3d-878e-2b34f7c9192f-kube-api-access-z8pmp\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.142056 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-config-data\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.243196 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-kolla-config\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.243615 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8pmp\" (UniqueName: \"kubernetes.io/projected/36e53353-e817-4d3d-878e-2b34f7c9192f-kube-api-access-z8pmp\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.243670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-config-data\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.245091 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-kolla-config\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.245112 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36e53353-e817-4d3d-878e-2b34f7c9192f-config-data\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc 
kubenswrapper[4772]: I0127 16:26:00.261145 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8pmp\" (UniqueName: \"kubernetes.io/projected/36e53353-e817-4d3d-878e-2b34f7c9192f-kube-api-access-z8pmp\") pod \"memcached-0\" (UID: \"36e53353-e817-4d3d-878e-2b34f7c9192f\") " pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.392144 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.508953 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" event={"ID":"bb8b7780-142e-4fd6-967f-a42e112a0b2e","Type":"ContainerStarted","Data":"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94"} Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.509013 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.517170 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerStarted","Data":"b96f230c3d38a86522fef8eccf06bafd1c3858ca3dbe42112ea60613f2b942f3"} Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.519317 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" event={"ID":"67aa19ec-be98-4d61-b758-0fd0f7f77f42","Type":"ContainerStarted","Data":"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"} Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.520527 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.528768 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" 
podStartSLOduration=3.528745176 podStartE2EDuration="3.528745176s" podCreationTimestamp="2026-01-27 16:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:00.526572213 +0000 UTC m=+4746.507181321" watchObservedRunningTime="2026-01-27 16:26:00.528745176 +0000 UTC m=+4746.509354274" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.552687 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" podStartSLOduration=3.552662405 podStartE2EDuration="3.552662405s" podCreationTimestamp="2026-01-27 16:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:00.547875397 +0000 UTC m=+4746.528484525" watchObservedRunningTime="2026-01-27 16:26:00.552662405 +0000 UTC m=+4746.533271513" Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.627231 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 16:26:00 crc kubenswrapper[4772]: W0127 16:26:00.699475 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeefd7ff4_5222_45cf_aaad_20ebfd50a2ff.slice/crio-9bd01a4fe738455348a843ce7b9fb4f1111f8cd4f653833eee18272b680d13f1 WatchSource:0}: Error finding container 9bd01a4fe738455348a843ce7b9fb4f1111f8cd4f653833eee18272b680d13f1: Status 404 returned error can't find the container with id 9bd01a4fe738455348a843ce7b9fb4f1111f8cd4f653833eee18272b680d13f1 Jan 27 16:26:00 crc kubenswrapper[4772]: I0127 16:26:00.822979 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 16:26:00 crc kubenswrapper[4772]: W0127 16:26:00.827776 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36e53353_e817_4d3d_878e_2b34f7c9192f.slice/crio-4022283e879f8ad28b8e8063f012de6623cca74757ef7b4e4af54829c24d00b0 WatchSource:0}: Error finding container 4022283e879f8ad28b8e8063f012de6623cca74757ef7b4e4af54829c24d00b0: Status 404 returned error can't find the container with id 4022283e879f8ad28b8e8063f012de6623cca74757ef7b4e4af54829c24d00b0 Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.236313 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.237838 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.242420 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.242651 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.242668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gbdqz" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.242810 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.251965 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258161 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\") pod \"openstack-cell1-galera-0\" (UID: 
\"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258249 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258273 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258331 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjj5\" (UniqueName: \"kubernetes.io/projected/be7c27a4-64d2-4581-8271-5aaf74103b04-kube-api-access-mvjj5\") pod 
\"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258823 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.258880 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360497 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjj5\" (UniqueName: \"kubernetes.io/projected/be7c27a4-64d2-4581-8271-5aaf74103b04-kube-api-access-mvjj5\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0" Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360612 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360641 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360667 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360685 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.360727 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.361141 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.363209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.363558 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.364160 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be7c27a4-64d2-4581-8271-5aaf74103b04-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.364949 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.365023 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b9ee123aaeb872cc0de9d19f51d95a4c0a289a7f718b9f548cfb57c8ed00262/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.367049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.369233 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be7c27a4-64d2-4581-8271-5aaf74103b04-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.381569 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjj5\" (UniqueName: \"kubernetes.io/projected/be7c27a4-64d2-4581-8271-5aaf74103b04-kube-api-access-mvjj5\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.386637 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eb9d0e9c-9416-463e-af03-585faac9acf4\") pod \"openstack-cell1-galera-0\" (UID: \"be7c27a4-64d2-4581-8271-5aaf74103b04\") " pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.536413 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff","Type":"ContainerStarted","Data":"319d2bf04906b7ea2601932e6eda46f58f2c951a1eef20989d0df442b8498b3f"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.536466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff","Type":"ContainerStarted","Data":"9bd01a4fe738455348a843ce7b9fb4f1111f8cd4f653833eee18272b680d13f1"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.539971 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerStarted","Data":"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.541596 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerStarted","Data":"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.543600 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"36e53353-e817-4d3d-878e-2b34f7c9192f","Type":"ContainerStarted","Data":"2aab1cfcdac8f489550d996cadf179ce2496b80131aeb9ca866dfb4aa5cfbdfe"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.543634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"36e53353-e817-4d3d-878e-2b34f7c9192f","Type":"ContainerStarted","Data":"4022283e879f8ad28b8e8063f012de6623cca74757ef7b4e4af54829c24d00b0"}
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.544099 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.552639 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:01 crc kubenswrapper[4772]: I0127 16:26:01.584436 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.584418339 podStartE2EDuration="1.584418339s" podCreationTimestamp="2026-01-27 16:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:01.57646983 +0000 UTC m=+4747.557078938" watchObservedRunningTime="2026-01-27 16:26:01.584418339 +0000 UTC m=+4747.565027437"
Jan 27 16:26:02 crc kubenswrapper[4772]: I0127 16:26:02.007357 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Jan 27 16:26:02 crc kubenswrapper[4772]: I0127 16:26:02.551247 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be7c27a4-64d2-4581-8271-5aaf74103b04","Type":"ContainerStarted","Data":"5edb0a193224020eb6e999d1c89efe76ea6edde4800d8fbee94b0bbbba0c753e"}
Jan 27 16:26:02 crc kubenswrapper[4772]: I0127 16:26:02.551733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be7c27a4-64d2-4581-8271-5aaf74103b04","Type":"ContainerStarted","Data":"22a6b00ac3e7621a53a0417bda2f532421c802801737fb94d2efef84383cc17a"}
Jan 27 16:26:04 crc kubenswrapper[4772]: I0127 16:26:04.568768 4772 generic.go:334] "Generic (PLEG): container finished" podID="eefd7ff4-5222-45cf-aaad-20ebfd50a2ff" containerID="319d2bf04906b7ea2601932e6eda46f58f2c951a1eef20989d0df442b8498b3f" exitCode=0
Jan 27 16:26:04 crc kubenswrapper[4772]: I0127 16:26:04.569116 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff","Type":"ContainerDied","Data":"319d2bf04906b7ea2601932e6eda46f58f2c951a1eef20989d0df442b8498b3f"}
Jan 27 16:26:05 crc kubenswrapper[4772]: I0127 16:26:05.393923 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 27 16:26:05 crc kubenswrapper[4772]: I0127 16:26:05.576633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"eefd7ff4-5222-45cf-aaad-20ebfd50a2ff","Type":"ContainerStarted","Data":"31c0f86da05308b57c6fff62a573d0982988a4b2b9c9952ee68b4f90a379009a"}
Jan 27 16:26:05 crc kubenswrapper[4772]: I0127 16:26:05.596329 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.596305481 podStartE2EDuration="7.596305481s" podCreationTimestamp="2026-01-27 16:25:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:05.596198278 +0000 UTC m=+4751.576807376" watchObservedRunningTime="2026-01-27 16:26:05.596305481 +0000 UTC m=+4751.576914579"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.226960 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"]
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.228524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.245611 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"]
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.332239 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5cpj\" (UniqueName: \"kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.332299 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.332363 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.434217 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.434647 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.434751 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5cpj\" (UniqueName: \"kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.434911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.435028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.458175 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5cpj\" (UniqueName: \"kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj\") pod \"redhat-operators-rcww9\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") " pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.548201 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.587413 4772 generic.go:334] "Generic (PLEG): container finished" podID="be7c27a4-64d2-4581-8271-5aaf74103b04" containerID="5edb0a193224020eb6e999d1c89efe76ea6edde4800d8fbee94b0bbbba0c753e" exitCode=0
Jan 27 16:26:06 crc kubenswrapper[4772]: I0127 16:26:06.587466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be7c27a4-64d2-4581-8271-5aaf74103b04","Type":"ContainerDied","Data":"5edb0a193224020eb6e999d1c89efe76ea6edde4800d8fbee94b0bbbba0c753e"}
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.011386 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"]
Jan 27 16:26:07 crc kubenswrapper[4772]: W0127 16:26:07.014415 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f798a7f_3f79_4bad_81ce_927deac7748c.slice/crio-968e17dd30db85c3b57c00c0b8a56c178398486e19424f28d377e321bb687c89 WatchSource:0}: Error finding container 968e17dd30db85c3b57c00c0b8a56c178398486e19424f28d377e321bb687c89: Status 404 returned error can't find the container with id 968e17dd30db85c3b57c00c0b8a56c178398486e19424f28d377e321bb687c89
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.579499 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76"
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.597295 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerID="1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58" exitCode=0
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.597396 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerDied","Data":"1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58"}
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.597487 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerStarted","Data":"968e17dd30db85c3b57c00c0b8a56c178398486e19424f28d377e321bb687c89"}
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.599532 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.608730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"be7c27a4-64d2-4581-8271-5aaf74103b04","Type":"ContainerStarted","Data":"e7e64e0f199444d267c2d7492662163ae8edb4f6a71875f0a5607f2ec36a95ba"}
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.676712 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.6766865079999995 podStartE2EDuration="7.676686508s" podCreationTimestamp="2026-01-27 16:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:07.670141059 +0000 UTC m=+4753.650750167" watchObservedRunningTime="2026-01-27 16:26:07.676686508 +0000 UTC m=+4753.657295616"
Jan 27 16:26:07 crc kubenswrapper[4772]: I0127 16:26:07.966359 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.021377 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"]
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.021611 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="dnsmasq-dns" containerID="cri-o://5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5" gracePeriod=10
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.466658 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.569715 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566ms\" (UniqueName: \"kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms\") pod \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") "
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.569859 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config\") pod \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") "
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.569916 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc\") pod \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\" (UID: \"67aa19ec-be98-4d61-b758-0fd0f7f77f42\") "
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.587207 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms" (OuterVolumeSpecName: "kube-api-access-566ms") pod "67aa19ec-be98-4d61-b758-0fd0f7f77f42" (UID: "67aa19ec-be98-4d61-b758-0fd0f7f77f42"). InnerVolumeSpecName "kube-api-access-566ms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.604228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config" (OuterVolumeSpecName: "config") pod "67aa19ec-be98-4d61-b758-0fd0f7f77f42" (UID: "67aa19ec-be98-4d61-b758-0fd0f7f77f42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.608245 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67aa19ec-be98-4d61-b758-0fd0f7f77f42" (UID: "67aa19ec-be98-4d61-b758-0fd0f7f77f42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.618334 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerStarted","Data":"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e"}
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.620532 4772 generic.go:334] "Generic (PLEG): container finished" podID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerID="5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5" exitCode=0
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.620591 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" event={"ID":"67aa19ec-be98-4d61-b758-0fd0f7f77f42","Type":"ContainerDied","Data":"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"}
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.620624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76" event={"ID":"67aa19ec-be98-4d61-b758-0fd0f7f77f42","Type":"ContainerDied","Data":"855fc44f7201e7a6bbe04d32cb4e1bdbca9adcf771a78c25383154b1e38bbac4"}
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.620646 4772 scope.go:117] "RemoveContainer" containerID="5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.620802 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-rvc76"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.675352 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-config\") on node \"crc\" DevicePath \"\""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.675386 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa19ec-be98-4d61-b758-0fd0f7f77f42-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.675399 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-566ms\" (UniqueName: \"kubernetes.io/projected/67aa19ec-be98-4d61-b758-0fd0f7f77f42-kube-api-access-566ms\") on node \"crc\" DevicePath \"\""
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.680361 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"]
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.681114 4772 scope.go:117] "RemoveContainer" containerID="ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.683047 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-rvc76"]
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.697706 4772 scope.go:117] "RemoveContainer" containerID="5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"
Jan 27 16:26:08 crc kubenswrapper[4772]: E0127 16:26:08.698151 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5\": container with ID starting with 5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5 not found: ID does not exist" containerID="5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.698212 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5"} err="failed to get container status \"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5\": rpc error: code = NotFound desc = could not find container \"5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5\": container with ID starting with 5350f301f4b930945ac0ca2e13712ac15b2d17386ce0426b5517080245adcfb5 not found: ID does not exist"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.698273 4772 scope.go:117] "RemoveContainer" containerID="ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b"
Jan 27 16:26:08 crc kubenswrapper[4772]: E0127 16:26:08.698640 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b\": container with ID starting with ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b not found: ID does not exist" containerID="ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b"
Jan 27 16:26:08 crc kubenswrapper[4772]: I0127 16:26:08.698670 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b"} err="failed to get container status \"ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b\": rpc error: code = NotFound desc = could not find container \"ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b\": container with ID starting with ee06b3482fdb47bc70c8781ce53754e851c0140b4c8f1a7fccfeea4aa97ef68b not found: ID does not exist"
Jan 27 16:26:09 crc kubenswrapper[4772]: I0127 16:26:09.629594 4772 generic.go:334] "Generic (PLEG): container finished" podID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerID="7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e" exitCode=0
Jan 27 16:26:09 crc kubenswrapper[4772]: I0127 16:26:09.629668 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerDied","Data":"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e"}
Jan 27 16:26:10 crc kubenswrapper[4772]: I0127 16:26:10.085864 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 27 16:26:10 crc kubenswrapper[4772]: I0127 16:26:10.086318 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 27 16:26:10 crc kubenswrapper[4772]: I0127 16:26:10.641838 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerStarted","Data":"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d"}
Jan 27 16:26:10 crc kubenswrapper[4772]: I0127 16:26:10.666603 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rcww9" podStartSLOduration=2.010051006 podStartE2EDuration="4.666383151s" podCreationTimestamp="2026-01-27 16:26:06 +0000 UTC" firstStartedPulling="2026-01-27 16:26:07.599263447 +0000 UTC m=+4753.579872535" lastFinishedPulling="2026-01-27 16:26:10.255595582 +0000 UTC m=+4756.236204680" observedRunningTime="2026-01-27 16:26:10.65942213 +0000 UTC m=+4756.640031228" watchObservedRunningTime="2026-01-27 16:26:10.666383151 +0000 UTC m=+4756.646992249"
Jan 27 16:26:10 crc kubenswrapper[4772]: I0127 16:26:10.673218 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" path="/var/lib/kubelet/pods/67aa19ec-be98-4d61-b758-0fd0f7f77f42/volumes"
Jan 27 16:26:11 crc kubenswrapper[4772]: I0127 16:26:11.554097 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:11 crc kubenswrapper[4772]: I0127 16:26:11.554154 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.020494 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.059053 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.059129 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.089565 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.369068 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 27 16:26:12 crc kubenswrapper[4772]: I0127 16:26:12.563437 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 27 16:26:16 crc kubenswrapper[4772]: I0127 16:26:16.549437 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:16 crc kubenswrapper[4772]: I0127 16:26:16.549800 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:16 crc kubenswrapper[4772]: I0127 16:26:16.599446 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:16 crc kubenswrapper[4772]: I0127 16:26:16.724333 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:16 crc kubenswrapper[4772]: I0127 16:26:16.833443 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"]
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.692719 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-4pgr2"]
Jan 27 16:26:18 crc kubenswrapper[4772]: E0127 16:26:18.693509 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="dnsmasq-dns"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.693532 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="dnsmasq-dns"
Jan 27 16:26:18 crc kubenswrapper[4772]: E0127 16:26:18.693562 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="init"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.693574 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="init"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.694059 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aa19ec-be98-4d61-b758-0fd0f7f77f42" containerName="dnsmasq-dns"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.695082 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.695609 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rcww9" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="registry-server" containerID="cri-o://c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d" gracePeriod=2
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.702785 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.713597 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4pgr2"]
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.822531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.822629 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2t2c\" (UniqueName: \"kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.923994 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.924073 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2t2c\" (UniqueName: \"kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.924905 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:18 crc kubenswrapper[4772]: I0127 16:26:18.952918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2t2c\" (UniqueName: \"kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c\") pod \"root-account-create-update-4pgr2\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.025546 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-4pgr2"
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.118958 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rcww9"
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.228638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities\") pod \"0f798a7f-3f79-4bad-81ce-927deac7748c\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") "
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.228834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5cpj\" (UniqueName: \"kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj\") pod \"0f798a7f-3f79-4bad-81ce-927deac7748c\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") "
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.228901 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content\") pod \"0f798a7f-3f79-4bad-81ce-927deac7748c\" (UID: \"0f798a7f-3f79-4bad-81ce-927deac7748c\") "
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.229771 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities" (OuterVolumeSpecName: "utilities") pod "0f798a7f-3f79-4bad-81ce-927deac7748c" (UID: "0f798a7f-3f79-4bad-81ce-927deac7748c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.236115 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj" (OuterVolumeSpecName: "kube-api-access-w5cpj") pod "0f798a7f-3f79-4bad-81ce-927deac7748c" (UID: "0f798a7f-3f79-4bad-81ce-927deac7748c"). InnerVolumeSpecName "kube-api-access-w5cpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.330368 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5cpj\" (UniqueName: \"kubernetes.io/projected/0f798a7f-3f79-4bad-81ce-927deac7748c-kube-api-access-w5cpj\") on node \"crc\" DevicePath \"\""
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.330400 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.484074 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-4pgr2"]
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.704091 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4pgr2" event={"ID":"eeb30f86-cf93-47e2-8dfa-0bec7d656b74","Type":"ContainerStarted","Data":"dbf03a502ec31eeb1c32c76f40edb6250c7c04cd55512165e350a627fdf8e1b7"}
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.704554 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4pgr2" event={"ID":"eeb30f86-cf93-47e2-8dfa-0bec7d656b74","Type":"ContainerStarted","Data":"2ddf6ca655defe81bb7463fb72db8f2faf5a112847354fe4ff6d98a7bbcf6b41"}
Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.705902 4772 generic.go:334] "Generic (PLEG): container finished"
podID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerID="c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d" exitCode=0 Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.705956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerDied","Data":"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d"} Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.705992 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rcww9" event={"ID":"0f798a7f-3f79-4bad-81ce-927deac7748c","Type":"ContainerDied","Data":"968e17dd30db85c3b57c00c0b8a56c178398486e19424f28d377e321bb687c89"} Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.706013 4772 scope.go:117] "RemoveContainer" containerID="c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.706054 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rcww9" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.725141 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-4pgr2" podStartSLOduration=1.725124503 podStartE2EDuration="1.725124503s" podCreationTimestamp="2026-01-27 16:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:19.720464549 +0000 UTC m=+4765.701073677" watchObservedRunningTime="2026-01-27 16:26:19.725124503 +0000 UTC m=+4765.705733601" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.728746 4772 scope.go:117] "RemoveContainer" containerID="7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.745550 4772 scope.go:117] "RemoveContainer" containerID="1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.761545 4772 scope.go:117] "RemoveContainer" containerID="c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d" Jan 27 16:26:19 crc kubenswrapper[4772]: E0127 16:26:19.761945 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d\": container with ID starting with c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d not found: ID does not exist" containerID="c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.761989 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d"} err="failed to get container status \"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d\": rpc error: 
code = NotFound desc = could not find container \"c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d\": container with ID starting with c53284a092cb30eb41fcbd2150f2e0435fbbe105e96d6d7b176f53ce99e3588d not found: ID does not exist" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.762017 4772 scope.go:117] "RemoveContainer" containerID="7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e" Jan 27 16:26:19 crc kubenswrapper[4772]: E0127 16:26:19.762332 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e\": container with ID starting with 7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e not found: ID does not exist" containerID="7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.762353 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e"} err="failed to get container status \"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e\": rpc error: code = NotFound desc = could not find container \"7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e\": container with ID starting with 7fc27289a60fa452c350532a44e942b0d6e7cd80328c4d362e21d2a68688c24e not found: ID does not exist" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.762366 4772 scope.go:117] "RemoveContainer" containerID="1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58" Jan 27 16:26:19 crc kubenswrapper[4772]: E0127 16:26:19.762644 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58\": container with ID starting with 
1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58 not found: ID does not exist" containerID="1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58" Jan 27 16:26:19 crc kubenswrapper[4772]: I0127 16:26:19.762682 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58"} err="failed to get container status \"1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58\": rpc error: code = NotFound desc = could not find container \"1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58\": container with ID starting with 1e79f6f181330775c970fd32e9724d3a5685272b702ae330691c90a1a8494c58 not found: ID does not exist" Jan 27 16:26:20 crc kubenswrapper[4772]: I0127 16:26:20.794062 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f798a7f-3f79-4bad-81ce-927deac7748c" (UID: "0f798a7f-3f79-4bad-81ce-927deac7748c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:26:20 crc kubenswrapper[4772]: I0127 16:26:20.851987 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f798a7f-3f79-4bad-81ce-927deac7748c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:20 crc kubenswrapper[4772]: I0127 16:26:20.942775 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"] Jan 27 16:26:20 crc kubenswrapper[4772]: I0127 16:26:20.949339 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rcww9"] Jan 27 16:26:21 crc kubenswrapper[4772]: I0127 16:26:21.724572 4772 generic.go:334] "Generic (PLEG): container finished" podID="eeb30f86-cf93-47e2-8dfa-0bec7d656b74" containerID="dbf03a502ec31eeb1c32c76f40edb6250c7c04cd55512165e350a627fdf8e1b7" exitCode=0 Jan 27 16:26:21 crc kubenswrapper[4772]: I0127 16:26:21.724689 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4pgr2" event={"ID":"eeb30f86-cf93-47e2-8dfa-0bec7d656b74","Type":"ContainerDied","Data":"dbf03a502ec31eeb1c32c76f40edb6250c7c04cd55512165e350a627fdf8e1b7"} Jan 27 16:26:22 crc kubenswrapper[4772]: I0127 16:26:22.679547 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" path="/var/lib/kubelet/pods/0f798a7f-3f79-4bad-81ce-927deac7748c/volumes" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.157462 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4pgr2" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.290664 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts\") pod \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.290708 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2t2c\" (UniqueName: \"kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c\") pod \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\" (UID: \"eeb30f86-cf93-47e2-8dfa-0bec7d656b74\") " Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.291768 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eeb30f86-cf93-47e2-8dfa-0bec7d656b74" (UID: "eeb30f86-cf93-47e2-8dfa-0bec7d656b74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.297572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c" (OuterVolumeSpecName: "kube-api-access-z2t2c") pod "eeb30f86-cf93-47e2-8dfa-0bec7d656b74" (UID: "eeb30f86-cf93-47e2-8dfa-0bec7d656b74"). InnerVolumeSpecName "kube-api-access-z2t2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.392963 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.393060 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2t2c\" (UniqueName: \"kubernetes.io/projected/eeb30f86-cf93-47e2-8dfa-0bec7d656b74-kube-api-access-z2t2c\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.759691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-4pgr2" event={"ID":"eeb30f86-cf93-47e2-8dfa-0bec7d656b74","Type":"ContainerDied","Data":"2ddf6ca655defe81bb7463fb72db8f2faf5a112847354fe4ff6d98a7bbcf6b41"} Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.759728 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ddf6ca655defe81bb7463fb72db8f2faf5a112847354fe4ff6d98a7bbcf6b41" Jan 27 16:26:23 crc kubenswrapper[4772]: I0127 16:26:23.759751 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-4pgr2" Jan 27 16:26:25 crc kubenswrapper[4772]: I0127 16:26:25.236228 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-4pgr2"] Jan 27 16:26:25 crc kubenswrapper[4772]: I0127 16:26:25.241998 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-4pgr2"] Jan 27 16:26:26 crc kubenswrapper[4772]: I0127 16:26:26.675265 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb30f86-cf93-47e2-8dfa-0bec7d656b74" path="/var/lib/kubelet/pods/eeb30f86-cf93-47e2-8dfa-0bec7d656b74/volumes" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.727875 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-269mm"] Jan 27 16:26:28 crc kubenswrapper[4772]: E0127 16:26:28.728591 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb30f86-cf93-47e2-8dfa-0bec7d656b74" containerName="mariadb-account-create-update" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728608 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb30f86-cf93-47e2-8dfa-0bec7d656b74" containerName="mariadb-account-create-update" Jan 27 16:26:28 crc kubenswrapper[4772]: E0127 16:26:28.728627 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="extract-content" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728635 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="extract-content" Jan 27 16:26:28 crc kubenswrapper[4772]: E0127 16:26:28.728653 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="registry-server" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728661 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="registry-server" Jan 27 16:26:28 crc kubenswrapper[4772]: E0127 16:26:28.728673 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="extract-utilities" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728681 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="extract-utilities" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728851 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f798a7f-3f79-4bad-81ce-927deac7748c" containerName="registry-server" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.728907 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb30f86-cf93-47e2-8dfa-0bec7d656b74" containerName="mariadb-account-create-update" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.729511 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.731852 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.734361 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-269mm"] Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.874634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts\") pod \"root-account-create-update-269mm\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.874736 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcgq\" (UniqueName: \"kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq\") pod \"root-account-create-update-269mm\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.975858 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts\") pod \"root-account-create-update-269mm\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.976016 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcgq\" (UniqueName: \"kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq\") pod \"root-account-create-update-269mm\" (UID: 
\"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.976836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts\") pod \"root-account-create-update-269mm\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:28 crc kubenswrapper[4772]: I0127 16:26:28.994090 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcgq\" (UniqueName: \"kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq\") pod \"root-account-create-update-269mm\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " pod="openstack/root-account-create-update-269mm" Jan 27 16:26:29 crc kubenswrapper[4772]: I0127 16:26:29.057462 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-269mm" Jan 27 16:26:29 crc kubenswrapper[4772]: I0127 16:26:29.452678 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-269mm"] Jan 27 16:26:29 crc kubenswrapper[4772]: W0127 16:26:29.460338 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde29a5f9_d1c3_413b_8e86_77d1e9f10602.slice/crio-f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292 WatchSource:0}: Error finding container f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292: Status 404 returned error can't find the container with id f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292 Jan 27 16:26:29 crc kubenswrapper[4772]: I0127 16:26:29.808281 4772 generic.go:334] "Generic (PLEG): container finished" podID="de29a5f9-d1c3-413b-8e86-77d1e9f10602" containerID="0bdb0516e5ad0fcd11824f097428db46c1768aa20c8111ee97d7a876d3f00649" exitCode=0 Jan 27 16:26:29 crc kubenswrapper[4772]: I0127 16:26:29.808318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-269mm" event={"ID":"de29a5f9-d1c3-413b-8e86-77d1e9f10602","Type":"ContainerDied","Data":"0bdb0516e5ad0fcd11824f097428db46c1768aa20c8111ee97d7a876d3f00649"} Jan 27 16:26:29 crc kubenswrapper[4772]: I0127 16:26:29.808342 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-269mm" event={"ID":"de29a5f9-d1c3-413b-8e86-77d1e9f10602","Type":"ContainerStarted","Data":"f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292"} Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.154379 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-269mm" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.209891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts\") pod \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.210003 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtcgq\" (UniqueName: \"kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq\") pod \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\" (UID: \"de29a5f9-d1c3-413b-8e86-77d1e9f10602\") " Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.210981 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de29a5f9-d1c3-413b-8e86-77d1e9f10602" (UID: "de29a5f9-d1c3-413b-8e86-77d1e9f10602"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.217371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq" (OuterVolumeSpecName: "kube-api-access-qtcgq") pod "de29a5f9-d1c3-413b-8e86-77d1e9f10602" (UID: "de29a5f9-d1c3-413b-8e86-77d1e9f10602"). InnerVolumeSpecName "kube-api-access-qtcgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.312298 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtcgq\" (UniqueName: \"kubernetes.io/projected/de29a5f9-d1c3-413b-8e86-77d1e9f10602-kube-api-access-qtcgq\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.312335 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de29a5f9-d1c3-413b-8e86-77d1e9f10602-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.823811 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-269mm" event={"ID":"de29a5f9-d1c3-413b-8e86-77d1e9f10602","Type":"ContainerDied","Data":"f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292"} Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.823854 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f42c60f151ca72489735671502b041bd6301ac421d9bf54bebf0fa9452852292" Jan 27 16:26:31 crc kubenswrapper[4772]: I0127 16:26:31.823890 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-269mm" Jan 27 16:26:32 crc kubenswrapper[4772]: I0127 16:26:32.832340 4772 generic.go:334] "Generic (PLEG): container finished" podID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerID="69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355" exitCode=0 Jan 27 16:26:32 crc kubenswrapper[4772]: I0127 16:26:32.832390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerDied","Data":"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355"} Jan 27 16:26:33 crc kubenswrapper[4772]: I0127 16:26:33.840011 4772 generic.go:334] "Generic (PLEG): container finished" podID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerID="480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb" exitCode=0 Jan 27 16:26:33 crc kubenswrapper[4772]: I0127 16:26:33.840095 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerDied","Data":"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb"} Jan 27 16:26:33 crc kubenswrapper[4772]: I0127 16:26:33.843883 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerStarted","Data":"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3"} Jan 27 16:26:33 crc kubenswrapper[4772]: I0127 16:26:33.844231 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 27 16:26:33 crc kubenswrapper[4772]: I0127 16:26:33.893491 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.893469513 podStartE2EDuration="36.893469513s" podCreationTimestamp="2026-01-27 16:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:33.88641775 +0000 UTC m=+4779.867026858" watchObservedRunningTime="2026-01-27 16:26:33.893469513 +0000 UTC m=+4779.874078621" Jan 27 16:26:34 crc kubenswrapper[4772]: I0127 16:26:34.852067 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerStarted","Data":"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291"} Jan 27 16:26:34 crc kubenswrapper[4772]: I0127 16:26:34.853094 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:26:34 crc kubenswrapper[4772]: I0127 16:26:34.882665 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.882644051 podStartE2EDuration="37.882644051s" podCreationTimestamp="2026-01-27 16:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:34.871049407 +0000 UTC m=+4780.851658515" watchObservedRunningTime="2026-01-27 16:26:34.882644051 +0000 UTC m=+4780.863253159" Jan 27 16:26:35 crc kubenswrapper[4772]: I0127 16:26:35.253111 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-269mm"] Jan 27 16:26:35 crc kubenswrapper[4772]: I0127 16:26:35.268039 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-269mm"] Jan 27 16:26:36 crc kubenswrapper[4772]: I0127 16:26:36.674416 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de29a5f9-d1c3-413b-8e86-77d1e9f10602" path="/var/lib/kubelet/pods/de29a5f9-d1c3-413b-8e86-77d1e9f10602/volumes" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.276221 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-bxrqf"] Jan 27 16:26:40 crc kubenswrapper[4772]: E0127 16:26:40.276883 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de29a5f9-d1c3-413b-8e86-77d1e9f10602" containerName="mariadb-account-create-update" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.276898 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="de29a5f9-d1c3-413b-8e86-77d1e9f10602" containerName="mariadb-account-create-update" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.277116 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="de29a5f9-d1c3-413b-8e86-77d1e9f10602" containerName="mariadb-account-create-update" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.277748 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.281005 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.284288 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bxrqf"] Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.366819 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts\") pod \"root-account-create-update-bxrqf\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.366892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5s5\" (UniqueName: \"kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5\") pod \"root-account-create-update-bxrqf\" (UID: 
\"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.468107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts\") pod \"root-account-create-update-bxrqf\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.468197 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5s5\" (UniqueName: \"kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5\") pod \"root-account-create-update-bxrqf\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.469387 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts\") pod \"root-account-create-update-bxrqf\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.490237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5s5\" (UniqueName: \"kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5\") pod \"root-account-create-update-bxrqf\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.595391 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.836534 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bxrqf"] Jan 27 16:26:40 crc kubenswrapper[4772]: I0127 16:26:40.899323 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bxrqf" event={"ID":"cb558533-27c2-4249-9beb-e01d5b918c58","Type":"ContainerStarted","Data":"e9040036f87513970452b16f9fd192af689be570efd1157cc6b81ecb51d01f63"} Jan 27 16:26:41 crc kubenswrapper[4772]: I0127 16:26:41.910100 4772 generic.go:334] "Generic (PLEG): container finished" podID="cb558533-27c2-4249-9beb-e01d5b918c58" containerID="65777ccce3cb931b879ebb264390f1a957ffbebf2e9446690c38c79d1e3ddb7c" exitCode=0 Jan 27 16:26:41 crc kubenswrapper[4772]: I0127 16:26:41.910459 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bxrqf" event={"ID":"cb558533-27c2-4249-9beb-e01d5b918c58","Type":"ContainerDied","Data":"65777ccce3cb931b879ebb264390f1a957ffbebf2e9446690c38c79d1e3ddb7c"} Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.058031 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.058092 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.058141 4772 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.058814 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.058873 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041" gracePeriod=600 Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.921184 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041" exitCode=0 Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.921282 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041"} Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.921517 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b"} Jan 27 16:26:42 crc kubenswrapper[4772]: I0127 16:26:42.921548 4772 scope.go:117] "RemoveContainer" 
containerID="8604202f7fe20b38dd6ccc7e97fcf384e30e6ff4cf589a28a42b70c4dab8470d" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.170338 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.221237 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk5s5\" (UniqueName: \"kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5\") pod \"cb558533-27c2-4249-9beb-e01d5b918c58\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.221406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts\") pod \"cb558533-27c2-4249-9beb-e01d5b918c58\" (UID: \"cb558533-27c2-4249-9beb-e01d5b918c58\") " Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.222136 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb558533-27c2-4249-9beb-e01d5b918c58" (UID: "cb558533-27c2-4249-9beb-e01d5b918c58"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.227370 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5" (OuterVolumeSpecName: "kube-api-access-jk5s5") pod "cb558533-27c2-4249-9beb-e01d5b918c58" (UID: "cb558533-27c2-4249-9beb-e01d5b918c58"). InnerVolumeSpecName "kube-api-access-jk5s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.322877 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk5s5\" (UniqueName: \"kubernetes.io/projected/cb558533-27c2-4249-9beb-e01d5b918c58-kube-api-access-jk5s5\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.322913 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb558533-27c2-4249-9beb-e01d5b918c58-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.929249 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bxrqf" event={"ID":"cb558533-27c2-4249-9beb-e01d5b918c58","Type":"ContainerDied","Data":"e9040036f87513970452b16f9fd192af689be570efd1157cc6b81ecb51d01f63"} Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.929268 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bxrqf" Jan 27 16:26:43 crc kubenswrapper[4772]: I0127 16:26:43.929290 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9040036f87513970452b16f9fd192af689be570efd1157cc6b81ecb51d01f63" Jan 27 16:26:48 crc kubenswrapper[4772]: I0127 16:26:48.732358 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 16:26:49 crc kubenswrapper[4772]: I0127 16:26:49.195526 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.789041 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"] Jan 27 16:26:54 crc kubenswrapper[4772]: E0127 16:26:54.789830 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb558533-27c2-4249-9beb-e01d5b918c58" containerName="mariadb-account-create-update" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.789841 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb558533-27c2-4249-9beb-e01d5b918c58" containerName="mariadb-account-create-update" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.789983 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb558533-27c2-4249-9beb-e01d5b918c58" containerName="mariadb-account-create-update" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.790786 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.798063 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"] Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.889775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m72l\" (UniqueName: \"kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.889818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.889851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.991362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m72l\" (UniqueName: \"kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.991405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.991438 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.992318 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:54 crc kubenswrapper[4772]: I0127 16:26:54.992369 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:55 crc kubenswrapper[4772]: I0127 16:26:55.013651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m72l\" (UniqueName: \"kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l\") pod \"dnsmasq-dns-5b7946d7b9-ndkll\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") " pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:55 crc kubenswrapper[4772]: I0127 16:26:55.109085 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:55 crc kubenswrapper[4772]: I0127 16:26:55.509300 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:26:55 crc kubenswrapper[4772]: I0127 16:26:55.555383 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"] Jan 27 16:26:55 crc kubenswrapper[4772]: W0127 16:26:55.556839 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a5db5c_8de5_441d_a8e9_7c07acc7df31.slice/crio-cd506cf5289047aba524d3325375f9767877e5e9af047a505f68883c25d7ad72 WatchSource:0}: Error finding container cd506cf5289047aba524d3325375f9767877e5e9af047a505f68883c25d7ad72: Status 404 returned error can't find the container with id cd506cf5289047aba524d3325375f9767877e5e9af047a505f68883c25d7ad72 Jan 27 16:26:56 crc kubenswrapper[4772]: I0127 16:26:56.034871 4772 generic.go:334] "Generic (PLEG): container finished" podID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerID="2b48ddd2c1b4ddcb3e7d2673318ce52a0b89618a27b83de04520bd8160030f43" exitCode=0 Jan 27 16:26:56 crc kubenswrapper[4772]: I0127 16:26:56.034989 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" event={"ID":"51a5db5c-8de5-441d-a8e9-7c07acc7df31","Type":"ContainerDied","Data":"2b48ddd2c1b4ddcb3e7d2673318ce52a0b89618a27b83de04520bd8160030f43"} Jan 27 16:26:56 crc kubenswrapper[4772]: I0127 16:26:56.035157 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" event={"ID":"51a5db5c-8de5-441d-a8e9-7c07acc7df31","Type":"ContainerStarted","Data":"cd506cf5289047aba524d3325375f9767877e5e9af047a505f68883c25d7ad72"} Jan 27 16:26:56 crc kubenswrapper[4772]: I0127 16:26:56.122583 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:26:57 crc 
kubenswrapper[4772]: I0127 16:26:57.044656 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" event={"ID":"51a5db5c-8de5-441d-a8e9-7c07acc7df31","Type":"ContainerStarted","Data":"a355488297e660433fbaef5e701f58797760e94b791d017a807af73346bc3756"} Jan 27 16:26:57 crc kubenswrapper[4772]: I0127 16:26:57.045992 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:26:57 crc kubenswrapper[4772]: I0127 16:26:57.069378 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" podStartSLOduration=3.069359651 podStartE2EDuration="3.069359651s" podCreationTimestamp="2026-01-27 16:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:26:57.069098184 +0000 UTC m=+4803.049707312" watchObservedRunningTime="2026-01-27 16:26:57.069359651 +0000 UTC m=+4803.049968749" Jan 27 16:26:57 crc kubenswrapper[4772]: I0127 16:26:57.501250 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="rabbitmq" containerID="cri-o://9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3" gracePeriod=604799 Jan 27 16:26:58 crc kubenswrapper[4772]: I0127 16:26:58.001246 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="rabbitmq" containerID="cri-o://5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291" gracePeriod=604799 Jan 27 16:26:58 crc kubenswrapper[4772]: I0127 16:26:58.730794 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="rabbitmq" probeResult="failure" 
output="dial tcp 10.217.0.235:5672: connect: connection refused" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.193464 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5672: connect: connection refused" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.715104 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.720785 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.739923 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.763310 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.763407 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.763524 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpgqq\" (UniqueName: 
\"kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.864448 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpgqq\" (UniqueName: \"kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.864519 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.864547 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.864983 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.865280 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:26:59 crc kubenswrapper[4772]: I0127 16:26:59.886099 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpgqq\" (UniqueName: \"kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq\") pod \"community-operators-jjgpk\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:00 crc kubenswrapper[4772]: I0127 16:27:00.045153 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:00 crc kubenswrapper[4772]: I0127 16:27:00.542813 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:27:01 crc kubenswrapper[4772]: I0127 16:27:01.083864 4772 generic.go:334] "Generic (PLEG): container finished" podID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerID="9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0" exitCode=0 Jan 27 16:27:01 crc kubenswrapper[4772]: I0127 16:27:01.083916 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerDied","Data":"9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0"} Jan 27 16:27:01 crc kubenswrapper[4772]: I0127 16:27:01.083956 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerStarted","Data":"aaf303dc4f995cb6bff56fd0049dc32fd5acf74cce10552ba6fbab1b994a80c7"} Jan 27 16:27:02 crc kubenswrapper[4772]: I0127 16:27:02.093546 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerStarted","Data":"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7"} Jan 27 16:27:03 crc kubenswrapper[4772]: I0127 16:27:03.100214 4772 generic.go:334] "Generic (PLEG): container finished" podID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerID="020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7" exitCode=0 Jan 27 16:27:03 crc kubenswrapper[4772]: I0127 16:27:03.100267 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerDied","Data":"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7"} Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.047796 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.113184 4772 generic.go:334] "Generic (PLEG): container finished" podID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerID="9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3" exitCode=0 Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.113270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerDied","Data":"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3"} Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.113313 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f38de303-3271-4d8a-b114-4fca1e36c6a3","Type":"ContainerDied","Data":"10715889097bdfd0b2f4c8a7bc95c59af267c1bbc009f6e01130ee1ccb028c38"} Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.113311 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.113371 4772 scope.go:117] "RemoveContainer" containerID="9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.118292 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerStarted","Data":"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf"} Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126546 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126612 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126802 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126924 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " 
Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126959 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.126989 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj9fs\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.127032 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.127086 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.127108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd\") pod \"f38de303-3271-4d8a-b114-4fca1e36c6a3\" (UID: \"f38de303-3271-4d8a-b114-4fca1e36c6a3\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.128208 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.128666 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.129339 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.134814 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.134890 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs" (OuterVolumeSpecName: "kube-api-access-hj9fs") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "kube-api-access-hj9fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.144225 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb" (OuterVolumeSpecName: "persistence") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "pvc-f4d1f856-0902-41db-b052-d29ccd2349fb". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.149169 4772 scope.go:117] "RemoveContainer" containerID="69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.149206 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info" (OuterVolumeSpecName: "pod-info") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.152819 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jjgpk" podStartSLOduration=2.749817112 podStartE2EDuration="5.152803326s" podCreationTimestamp="2026-01-27 16:26:59 +0000 UTC" firstStartedPulling="2026-01-27 16:27:01.087527085 +0000 UTC m=+4807.068136183" lastFinishedPulling="2026-01-27 16:27:03.490513289 +0000 UTC m=+4809.471122397" observedRunningTime="2026-01-27 16:27:04.144440225 +0000 UTC m=+4810.125049333" watchObservedRunningTime="2026-01-27 16:27:04.152803326 +0000 UTC m=+4810.133412424" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.172822 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf" (OuterVolumeSpecName: "server-conf") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228443 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f38de303-3271-4d8a-b114-4fca1e36c6a3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228476 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228488 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj9fs\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-kube-api-access-hj9fs\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228497 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228505 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f38de303-3271-4d8a-b114-4fca1e36c6a3-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228514 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f38de303-3271-4d8a-b114-4fca1e36c6a3-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.228522 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: 
I0127 16:27:04.228551 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") on node \"crc\" " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.231429 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f38de303-3271-4d8a-b114-4fca1e36c6a3" (UID: "f38de303-3271-4d8a-b114-4fca1e36c6a3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.247803 4772 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.247963 4772 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f4d1f856-0902-41db-b052-d29ccd2349fb" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb") on node "crc" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.286106 4772 scope.go:117] "RemoveContainer" containerID="9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3" Jan 27 16:27:04 crc kubenswrapper[4772]: E0127 16:27:04.286518 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3\": container with ID starting with 9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3 not found: ID does not exist" containerID="9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.286572 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3"} err="failed to get container status \"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3\": rpc error: code = NotFound desc = could not find container \"9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3\": container with ID starting with 9b0305c5b89b648ee767009d14d68e95e3436ecb734d1416e97e7782000adad3 not found: ID does not exist" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.286653 4772 scope.go:117] "RemoveContainer" containerID="69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355" Jan 27 16:27:04 crc kubenswrapper[4772]: E0127 16:27:04.286986 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355\": container with ID starting with 69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355 not found: ID does not exist" containerID="69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.287036 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355"} err="failed to get container status \"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355\": rpc error: code = NotFound desc = could not find container \"69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355\": container with ID starting with 69ba34cf36586fe5c71196b7621cd2b9ba30a359ba359f88ecfede8d86e1b355 not found: ID does not exist" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.330559 4772 reconciler_common.go:293] "Volume detached for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") on node \"crc\" 
DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.330610 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f38de303-3271-4d8a-b114-4fca1e36c6a3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.448856 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.454735 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.477737 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:27:04 crc kubenswrapper[4772]: E0127 16:27:04.478104 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="setup-container" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.478126 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="setup-container" Jan 27 16:27:04 crc kubenswrapper[4772]: E0127 16:27:04.478150 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="rabbitmq" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.478158 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="rabbitmq" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.478373 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" containerName="rabbitmq" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.479268 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.481429 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.481685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.481875 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.482018 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kn7v7" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.484879 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.499877 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.534931 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.535265 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.535379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.537234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.537439 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3b5f224-602e-454a-b35e-2e55160016b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.537715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3b5f224-602e-454a-b35e-2e55160016b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.537945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.538050 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.538160 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4qzb\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-kube-api-access-h4qzb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.579639 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639514 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639585 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639639 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qc4j\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639659 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639831 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639853 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639896 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.639927 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie\") pod \"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\" (UID: 
\"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b\") " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640141 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640161 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640870 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640940 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.640985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641012 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641267 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641337 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641433 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3b5f224-602e-454a-b35e-2e55160016b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc 
kubenswrapper[4772]: I0127 16:27:04.641469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3b5f224-602e-454a-b35e-2e55160016b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641500 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641518 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4qzb\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-kube-api-access-h4qzb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641599 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641609 4772 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.641618 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.642150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.645437 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info" (OuterVolumeSpecName: "pod-info") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.646028 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f3b5f224-602e-454a-b35e-2e55160016b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.647720 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j" (OuterVolumeSpecName: "kube-api-access-8qc4j") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "kube-api-access-8qc4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.648583 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f3b5f224-602e-454a-b35e-2e55160016b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.648678 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.648703 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.648712 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/36a183de2c7abb5d8abee5f0c83592d4960d3f7dbd03e4d4afd32924fe238d72/globalmount\"" pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.650787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.657651 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f3b5f224-602e-454a-b35e-2e55160016b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.662861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf" (OuterVolumeSpecName: "server-conf") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.664784 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899" (OuterVolumeSpecName: "persistence") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.670909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4qzb\" (UniqueName: \"kubernetes.io/projected/f3b5f224-602e-454a-b35e-2e55160016b5-kube-api-access-h4qzb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.693128 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38de303-3271-4d8a-b114-4fca1e36c6a3" path="/var/lib/kubelet/pods/f38de303-3271-4d8a-b114-4fca1e36c6a3/volumes" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.708457 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f4d1f856-0902-41db-b052-d29ccd2349fb\") pod \"rabbitmq-server-0\" (UID: \"f3b5f224-602e-454a-b35e-2e55160016b5\") " pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.743135 4772 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.743458 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qc4j\" (UniqueName: 
\"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-kube-api-access-8qc4j\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.743608 4772 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") on node \"crc\" " Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.743708 4772 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.743809 4772 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.747445 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" (UID: "18fc2383-1b4e-43c5-b6cb-8aa40600cf7b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.761746 4772 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.761936 4772 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899") on node "crc" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.806261 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.845684 4772 reconciler_common.go:293] "Volume detached for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:04 crc kubenswrapper[4772]: I0127 16:27:04.845718 4772 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.111077 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.132604 4772 generic.go:334] "Generic (PLEG): container finished" podID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerID="5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291" exitCode=0 Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.133522 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.136379 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerDied","Data":"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291"} Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.136457 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"18fc2383-1b4e-43c5-b6cb-8aa40600cf7b","Type":"ContainerDied","Data":"b96f230c3d38a86522fef8eccf06bafd1c3858ca3dbe42112ea60613f2b942f3"} Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.136496 4772 scope.go:117] "RemoveContainer" containerID="5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.169284 4772 scope.go:117] "RemoveContainer" containerID="480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.188913 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.189186 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="dnsmasq-dns" containerID="cri-o://9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94" gracePeriod=10 Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.200229 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.207967 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.212251 4772 scope.go:117] "RemoveContainer" 
containerID="5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291" Jan 27 16:27:05 crc kubenswrapper[4772]: E0127 16:27:05.212836 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291\": container with ID starting with 5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291 not found: ID does not exist" containerID="5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.212903 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291"} err="failed to get container status \"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291\": rpc error: code = NotFound desc = could not find container \"5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291\": container with ID starting with 5308f6aebce7e906d821045f50b393cfaba5255e05a4b74e645d3d680c330291 not found: ID does not exist" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.212946 4772 scope.go:117] "RemoveContainer" containerID="480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb" Jan 27 16:27:05 crc kubenswrapper[4772]: E0127 16:27:05.213378 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb\": container with ID starting with 480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb not found: ID does not exist" containerID="480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.213412 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb"} err="failed to get container status \"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb\": rpc error: code = NotFound desc = could not find container \"480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb\": container with ID starting with 480ea6652ba12d615e1f6b8fb144377c490339e8e4bed427eade476083ce2cbb not found: ID does not exist" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.238454 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:27:05 crc kubenswrapper[4772]: E0127 16:27:05.238800 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="rabbitmq" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.238820 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="rabbitmq" Jan 27 16:27:05 crc kubenswrapper[4772]: E0127 16:27:05.238840 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="setup-container" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.238848 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="setup-container" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.239034 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" containerName="rabbitmq" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.243146 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.247830 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.249674 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.249741 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.250016 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-554gz" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.250046 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.255598 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 16:27:05 crc kubenswrapper[4772]: W0127 16:27:05.266772 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b5f224_602e_454a_b35e_2e55160016b5.slice/crio-735e34770b8fa85917d67cdf87dff85c464a64682387bc5b15840a99a4ae68bd WatchSource:0}: Error finding container 735e34770b8fa85917d67cdf87dff85c464a64682387bc5b15840a99a4ae68bd: Status 404 returned error can't find the container with id 735e34770b8fa85917d67cdf87dff85c464a64682387bc5b15840a99a4ae68bd Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.268398 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354230 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354296 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354319 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354383 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7dwn\" (UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-kube-api-access-d7dwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354433 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354472 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354533 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.354565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459206 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459284 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459341 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459479 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459506 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459572 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7dwn\" (UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-kube-api-access-d7dwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.459963 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.460723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.461531 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.461601 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.464322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.464347 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.465030 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.465069 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/74f6c4054e7c826304958ef416594be6d4b6260f90a6b43d068948e9c0dc0fa0/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.465228 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.475765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7dwn\" (UniqueName: \"kubernetes.io/projected/1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88-kube-api-access-d7dwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.512400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b481002b-0e4c-443b-9281-7c1ac6b1e899\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.594443 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.722260 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.865644 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config\") pod \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.865709 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc\") pod \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.865748 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqg6h\" (UniqueName: \"kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h\") pod \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\" (UID: \"bb8b7780-142e-4fd6-967f-a42e112a0b2e\") " Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.874387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h" (OuterVolumeSpecName: "kube-api-access-cqg6h") pod "bb8b7780-142e-4fd6-967f-a42e112a0b2e" (UID: "bb8b7780-142e-4fd6-967f-a42e112a0b2e"). InnerVolumeSpecName "kube-api-access-cqg6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.901413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config" (OuterVolumeSpecName: "config") pod "bb8b7780-142e-4fd6-967f-a42e112a0b2e" (UID: "bb8b7780-142e-4fd6-967f-a42e112a0b2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.910527 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb8b7780-142e-4fd6-967f-a42e112a0b2e" (UID: "bb8b7780-142e-4fd6-967f-a42e112a0b2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.967773 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.968060 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb8b7780-142e-4fd6-967f-a42e112a0b2e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:05 crc kubenswrapper[4772]: I0127 16:27:05.968145 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqg6h\" (UniqueName: \"kubernetes.io/projected/bb8b7780-142e-4fd6-967f-a42e112a0b2e-kube-api-access-cqg6h\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.071497 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 16:27:06 crc kubenswrapper[4772]: W0127 16:27:06.086815 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c2904f0_5ba8_4bb4_9952_ca1ee06a4d88.slice/crio-e0c5cacd55009b64e1a8abc6f9b74ae8e9045b3dca625c9f4afd5c099f239b3b WatchSource:0}: Error finding container e0c5cacd55009b64e1a8abc6f9b74ae8e9045b3dca625c9f4afd5c099f239b3b: Status 404 returned error can't find the container with id e0c5cacd55009b64e1a8abc6f9b74ae8e9045b3dca625c9f4afd5c099f239b3b Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.140200 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3b5f224-602e-454a-b35e-2e55160016b5","Type":"ContainerStarted","Data":"735e34770b8fa85917d67cdf87dff85c464a64682387bc5b15840a99a4ae68bd"} Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.141125 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88","Type":"ContainerStarted","Data":"e0c5cacd55009b64e1a8abc6f9b74ae8e9045b3dca625c9f4afd5c099f239b3b"} Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.142623 4772 generic.go:334] "Generic (PLEG): container finished" podID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerID="9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94" exitCode=0 Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.142647 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" event={"ID":"bb8b7780-142e-4fd6-967f-a42e112a0b2e","Type":"ContainerDied","Data":"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94"} Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.142676 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.142688 4772 scope.go:117] "RemoveContainer" containerID="9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.142675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-mmzht" event={"ID":"bb8b7780-142e-4fd6-967f-a42e112a0b2e","Type":"ContainerDied","Data":"78ea743dbcac291e88b556c9f353cf72b6797716babc1be9562796c00fc48cc9"} Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.177339 4772 scope.go:117] "RemoveContainer" containerID="16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.192044 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.205651 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-mmzht"] Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.320244 4772 scope.go:117] "RemoveContainer" containerID="9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94" Jan 27 16:27:06 crc kubenswrapper[4772]: E0127 16:27:06.320640 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94\": container with ID starting with 9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94 not found: ID does not exist" containerID="9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.320677 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94"} err="failed to get container status 
\"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94\": rpc error: code = NotFound desc = could not find container \"9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94\": container with ID starting with 9c160ab266039b95d6f27552251a6a39ff7c08fcb4346eec1f5f3d5a708d9f94 not found: ID does not exist" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.320700 4772 scope.go:117] "RemoveContainer" containerID="16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5" Jan 27 16:27:06 crc kubenswrapper[4772]: E0127 16:27:06.321052 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5\": container with ID starting with 16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5 not found: ID does not exist" containerID="16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.321077 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5"} err="failed to get container status \"16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5\": rpc error: code = NotFound desc = could not find container \"16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5\": container with ID starting with 16ae7072b9a91932c05ae95c0de53fbb1486c6e51d07f50b89c0848be4b668b5 not found: ID does not exist" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.672831 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18fc2383-1b4e-43c5-b6cb-8aa40600cf7b" path="/var/lib/kubelet/pods/18fc2383-1b4e-43c5-b6cb-8aa40600cf7b/volumes" Jan 27 16:27:06 crc kubenswrapper[4772]: I0127 16:27:06.673479 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" 
path="/var/lib/kubelet/pods/bb8b7780-142e-4fd6-967f-a42e112a0b2e/volumes" Jan 27 16:27:07 crc kubenswrapper[4772]: I0127 16:27:07.152426 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3b5f224-602e-454a-b35e-2e55160016b5","Type":"ContainerStarted","Data":"6d2656c9e69a02c468dda617caf117c0c2d45cfe2e8706e5cd1d82cc528442ff"} Jan 27 16:27:08 crc kubenswrapper[4772]: I0127 16:27:08.162214 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88","Type":"ContainerStarted","Data":"0980d84a86e629e38b050c4e0e81922329b893eae78161f17488b28b814908bb"} Jan 27 16:27:10 crc kubenswrapper[4772]: I0127 16:27:10.045317 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:10 crc kubenswrapper[4772]: I0127 16:27:10.045534 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:10 crc kubenswrapper[4772]: I0127 16:27:10.096538 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:10 crc kubenswrapper[4772]: I0127 16:27:10.217086 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:10 crc kubenswrapper[4772]: I0127 16:27:10.329146 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.192630 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jjgpk" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="registry-server" containerID="cri-o://647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf" gracePeriod=2 Jan 27 
16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.570582 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.692900 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpgqq\" (UniqueName: \"kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq\") pod \"b2025168-9cc0-417c-8d3b-1d336447a3ff\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.692999 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities\") pod \"b2025168-9cc0-417c-8d3b-1d336447a3ff\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.693096 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content\") pod \"b2025168-9cc0-417c-8d3b-1d336447a3ff\" (UID: \"b2025168-9cc0-417c-8d3b-1d336447a3ff\") " Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.693875 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities" (OuterVolumeSpecName: "utilities") pod "b2025168-9cc0-417c-8d3b-1d336447a3ff" (UID: "b2025168-9cc0-417c-8d3b-1d336447a3ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.697992 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq" (OuterVolumeSpecName: "kube-api-access-gpgqq") pod "b2025168-9cc0-417c-8d3b-1d336447a3ff" (UID: "b2025168-9cc0-417c-8d3b-1d336447a3ff"). InnerVolumeSpecName "kube-api-access-gpgqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.795789 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpgqq\" (UniqueName: \"kubernetes.io/projected/b2025168-9cc0-417c-8d3b-1d336447a3ff-kube-api-access-gpgqq\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:12 crc kubenswrapper[4772]: I0127 16:27:12.795834 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.201081 4772 generic.go:334] "Generic (PLEG): container finished" podID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerID="647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf" exitCode=0 Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.201131 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerDied","Data":"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf"} Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.201188 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jjgpk" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.202473 4772 scope.go:117] "RemoveContainer" containerID="647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.202364 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jjgpk" event={"ID":"b2025168-9cc0-417c-8d3b-1d336447a3ff","Type":"ContainerDied","Data":"aaf303dc4f995cb6bff56fd0049dc32fd5acf74cce10552ba6fbab1b994a80c7"} Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.222146 4772 scope.go:117] "RemoveContainer" containerID="020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.248116 4772 scope.go:117] "RemoveContainer" containerID="9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.290462 4772 scope.go:117] "RemoveContainer" containerID="647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf" Jan 27 16:27:13 crc kubenswrapper[4772]: E0127 16:27:13.290863 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf\": container with ID starting with 647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf not found: ID does not exist" containerID="647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.290898 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf"} err="failed to get container status \"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf\": rpc error: code = NotFound desc = could not find container 
\"647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf\": container with ID starting with 647af5829c88ac17630d560d584a1467c6061b43b60545e27fdaa4193667e8cf not found: ID does not exist" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.290917 4772 scope.go:117] "RemoveContainer" containerID="020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7" Jan 27 16:27:13 crc kubenswrapper[4772]: E0127 16:27:13.291302 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7\": container with ID starting with 020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7 not found: ID does not exist" containerID="020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.291362 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7"} err="failed to get container status \"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7\": rpc error: code = NotFound desc = could not find container \"020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7\": container with ID starting with 020319dacd513bbd957b8817f14f5c128e22f4c1f10468a190b399e9005621c7 not found: ID does not exist" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.291396 4772 scope.go:117] "RemoveContainer" containerID="9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0" Jan 27 16:27:13 crc kubenswrapper[4772]: E0127 16:27:13.291725 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0\": container with ID starting with 9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0 not found: ID does not exist" 
containerID="9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.291757 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0"} err="failed to get container status \"9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0\": rpc error: code = NotFound desc = could not find container \"9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0\": container with ID starting with 9a45c145ee6112b35c7f85262dd11f2bab997eedc4836c666662e77f41944aa0 not found: ID does not exist" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.559656 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2025168-9cc0-417c-8d3b-1d336447a3ff" (UID: "b2025168-9cc0-417c-8d3b-1d336447a3ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.610525 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2025168-9cc0-417c-8d3b-1d336447a3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.843861 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:27:13 crc kubenswrapper[4772]: I0127 16:27:13.851948 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jjgpk"] Jan 27 16:27:14 crc kubenswrapper[4772]: I0127 16:27:14.674308 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" path="/var/lib/kubelet/pods/b2025168-9cc0-417c-8d3b-1d336447a3ff/volumes" Jan 27 16:27:39 crc kubenswrapper[4772]: I0127 16:27:39.405936 4772 generic.go:334] "Generic (PLEG): container finished" podID="f3b5f224-602e-454a-b35e-2e55160016b5" containerID="6d2656c9e69a02c468dda617caf117c0c2d45cfe2e8706e5cd1d82cc528442ff" exitCode=0 Jan 27 16:27:39 crc kubenswrapper[4772]: I0127 16:27:39.406031 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3b5f224-602e-454a-b35e-2e55160016b5","Type":"ContainerDied","Data":"6d2656c9e69a02c468dda617caf117c0c2d45cfe2e8706e5cd1d82cc528442ff"} Jan 27 16:27:39 crc kubenswrapper[4772]: I0127 16:27:39.409005 4772 generic.go:334] "Generic (PLEG): container finished" podID="1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88" containerID="0980d84a86e629e38b050c4e0e81922329b893eae78161f17488b28b814908bb" exitCode=0 Jan 27 16:27:39 crc kubenswrapper[4772]: I0127 16:27:39.409049 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88","Type":"ContainerDied","Data":"0980d84a86e629e38b050c4e0e81922329b893eae78161f17488b28b814908bb"} Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.418006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f3b5f224-602e-454a-b35e-2e55160016b5","Type":"ContainerStarted","Data":"687181ce3ac97e8e7c2b77ea2674ed00ed45a1ee0b30edbe6112a0965279b62a"} Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.418591 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.420284 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88","Type":"ContainerStarted","Data":"ecc8aa55c8175131ece32925bb28b99bd308ecf7369edc88c3c039d474fb4cc3"} Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.420512 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.448010 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.447989759 podStartE2EDuration="36.447989759s" podCreationTimestamp="2026-01-27 16:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:27:40.440655808 +0000 UTC m=+4846.421264926" watchObservedRunningTime="2026-01-27 16:27:40.447989759 +0000 UTC m=+4846.428598857" Jan 27 16:27:40 crc kubenswrapper[4772]: I0127 16:27:40.466575 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.466556794 podStartE2EDuration="35.466556794s" podCreationTimestamp="2026-01-27 16:27:05 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:27:40.460825959 +0000 UTC m=+4846.441435057" watchObservedRunningTime="2026-01-27 16:27:40.466556794 +0000 UTC m=+4846.447165882" Jan 27 16:27:54 crc kubenswrapper[4772]: I0127 16:27:54.810074 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 16:27:55 crc kubenswrapper[4772]: I0127 16:27:55.598473 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.523263 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:06 crc kubenswrapper[4772]: E0127 16:28:06.524207 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="registry-server" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524225 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="registry-server" Jan 27 16:28:06 crc kubenswrapper[4772]: E0127 16:28:06.524242 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="extract-content" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524253 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="extract-content" Jan 27 16:28:06 crc kubenswrapper[4772]: E0127 16:28:06.524277 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="init" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524286 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="init" Jan 27 16:28:06 crc kubenswrapper[4772]: E0127 16:28:06.524301 4772 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="extract-utilities" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524311 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="extract-utilities" Jan 27 16:28:06 crc kubenswrapper[4772]: E0127 16:28:06.524340 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="dnsmasq-dns" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524348 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="dnsmasq-dns" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524535 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8b7780-142e-4fd6-967f-a42e112a0b2e" containerName="dnsmasq-dns" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.524566 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2025168-9cc0-417c-8d3b-1d336447a3ff" containerName="registry-server" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.525480 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.533350 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jd6dc" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.536318 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.623394 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdcs\" (UniqueName: \"kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs\") pod \"mariadb-client\" (UID: \"96c99cc4-7be7-49a2-bbc4-a16c3698e291\") " pod="openstack/mariadb-client" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.724868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdcs\" (UniqueName: \"kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs\") pod \"mariadb-client\" (UID: \"96c99cc4-7be7-49a2-bbc4-a16c3698e291\") " pod="openstack/mariadb-client" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.746719 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdcs\" (UniqueName: \"kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs\") pod \"mariadb-client\" (UID: \"96c99cc4-7be7-49a2-bbc4-a16c3698e291\") " pod="openstack/mariadb-client" Jan 27 16:28:06 crc kubenswrapper[4772]: I0127 16:28:06.848583 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:28:07 crc kubenswrapper[4772]: I0127 16:28:07.403370 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:07 crc kubenswrapper[4772]: I0127 16:28:07.617681 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"96c99cc4-7be7-49a2-bbc4-a16c3698e291","Type":"ContainerStarted","Data":"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f"} Jan 27 16:28:07 crc kubenswrapper[4772]: I0127 16:28:07.617724 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"96c99cc4-7be7-49a2-bbc4-a16c3698e291","Type":"ContainerStarted","Data":"9361cb30d4628d107a493b7c099f845f78b344e58ebf24b88567206da733e5ba"} Jan 27 16:28:07 crc kubenswrapper[4772]: I0127 16:28:07.630775 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.630755239 podStartE2EDuration="1.630755239s" podCreationTimestamp="2026-01-27 16:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:28:07.628401601 +0000 UTC m=+4873.609010699" watchObservedRunningTime="2026-01-27 16:28:07.630755239 +0000 UTC m=+4873.611364337" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.095448 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.096383 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" containerName="mariadb-client" containerID="cri-o://e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f" gracePeriod=30 Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.652921 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.739692 4772 generic.go:334] "Generic (PLEG): container finished" podID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" containerID="e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f" exitCode=143 Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.739733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"96c99cc4-7be7-49a2-bbc4-a16c3698e291","Type":"ContainerDied","Data":"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f"} Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.739738 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.739757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"96c99cc4-7be7-49a2-bbc4-a16c3698e291","Type":"ContainerDied","Data":"9361cb30d4628d107a493b7c099f845f78b344e58ebf24b88567206da733e5ba"} Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.739771 4772 scope.go:117] "RemoveContainer" containerID="e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.762844 4772 scope.go:117] "RemoveContainer" containerID="e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f" Jan 27 16:28:22 crc kubenswrapper[4772]: E0127 16:28:22.763287 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f\": container with ID starting with e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f not found: ID does not exist" containerID="e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.763327 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f"} err="failed to get container status \"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f\": rpc error: code = NotFound desc = could not find container \"e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f\": container with ID starting with e2528fc25164a9d3257d0c9f90fb442cd95239856af489f43f185023ba775a4f not found: ID does not exist" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.778102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpdcs\" (UniqueName: \"kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs\") pod \"96c99cc4-7be7-49a2-bbc4-a16c3698e291\" (UID: \"96c99cc4-7be7-49a2-bbc4-a16c3698e291\") " Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.783428 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs" (OuterVolumeSpecName: "kube-api-access-hpdcs") pod "96c99cc4-7be7-49a2-bbc4-a16c3698e291" (UID: "96c99cc4-7be7-49a2-bbc4-a16c3698e291"). InnerVolumeSpecName "kube-api-access-hpdcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:28:22 crc kubenswrapper[4772]: I0127 16:28:22.879720 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpdcs\" (UniqueName: \"kubernetes.io/projected/96c99cc4-7be7-49a2-bbc4-a16c3698e291-kube-api-access-hpdcs\") on node \"crc\" DevicePath \"\"" Jan 27 16:28:23 crc kubenswrapper[4772]: I0127 16:28:23.084354 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:23 crc kubenswrapper[4772]: I0127 16:28:23.091146 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:28:24 crc kubenswrapper[4772]: I0127 16:28:24.672942 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" path="/var/lib/kubelet/pods/96c99cc4-7be7-49a2-bbc4-a16c3698e291/volumes" Jan 27 16:28:42 crc kubenswrapper[4772]: I0127 16:28:42.059031 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:28:42 crc kubenswrapper[4772]: I0127 16:28:42.059660 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:29:06 crc kubenswrapper[4772]: I0127 16:29:06.412068 4772 scope.go:117] "RemoveContainer" containerID="a62dadf36906064bb1b0580332d53e82d1766f5edc230560463a0bad481701be" Jan 27 16:29:12 crc kubenswrapper[4772]: I0127 16:29:12.059043 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:29:12 crc kubenswrapper[4772]: I0127 16:29:12.059596 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.058744 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.059517 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.059598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.060638 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:29:42 crc 
kubenswrapper[4772]: I0127 16:29:42.060765 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" gracePeriod=600 Jan 27 16:29:42 crc kubenswrapper[4772]: E0127 16:29:42.183259 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.349318 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" exitCode=0 Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.349386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b"} Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.349463 4772 scope.go:117] "RemoveContainer" containerID="f3a08a71f69d769f4a6a29d6cef13873c9dceaa6515bc086fcafc82c5f73a041" Jan 27 16:29:42 crc kubenswrapper[4772]: I0127 16:29:42.350083 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:29:42 crc kubenswrapper[4772]: E0127 16:29:42.350331 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:29:55 crc kubenswrapper[4772]: I0127 16:29:55.663254 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:29:55 crc kubenswrapper[4772]: E0127 16:29:55.663990 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.144406 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9"] Jan 27 16:30:00 crc kubenswrapper[4772]: E0127 16:30:00.145466 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" containerName="mariadb-client" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.145484 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" containerName="mariadb-client" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.145711 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="96c99cc4-7be7-49a2-bbc4-a16c3698e291" containerName="mariadb-client" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.146409 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.151979 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.152222 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.153439 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9"] Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.235224 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.235303 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cq4h\" (UniqueName: \"kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.235411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.336965 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.337012 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cq4h\" (UniqueName: \"kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.337056 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.337938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.343293 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.358382 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cq4h\" (UniqueName: \"kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h\") pod \"collect-profiles-29492190-nkwl9\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.470198 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:00 crc kubenswrapper[4772]: I0127 16:30:00.884077 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9"] Jan 27 16:30:01 crc kubenswrapper[4772]: I0127 16:30:01.504966 4772 generic.go:334] "Generic (PLEG): container finished" podID="aad11684-a5b7-4df1-9d18-5179c6113f66" containerID="114edebee04cbeb82762a8f7e28bf44b5665934fa2746ec43b3a0a20d9084515" exitCode=0 Jan 27 16:30:01 crc kubenswrapper[4772]: I0127 16:30:01.505013 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" event={"ID":"aad11684-a5b7-4df1-9d18-5179c6113f66","Type":"ContainerDied","Data":"114edebee04cbeb82762a8f7e28bf44b5665934fa2746ec43b3a0a20d9084515"} Jan 27 16:30:01 crc kubenswrapper[4772]: I0127 16:30:01.505039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" 
event={"ID":"aad11684-a5b7-4df1-9d18-5179c6113f66","Type":"ContainerStarted","Data":"fb69b2bdd58763a29639c10b7607422660e39cd3e742d1e9cb84e4ab63d757cb"} Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.778672 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.878995 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume\") pod \"aad11684-a5b7-4df1-9d18-5179c6113f66\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.879188 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cq4h\" (UniqueName: \"kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h\") pod \"aad11684-a5b7-4df1-9d18-5179c6113f66\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.879315 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume\") pod \"aad11684-a5b7-4df1-9d18-5179c6113f66\" (UID: \"aad11684-a5b7-4df1-9d18-5179c6113f66\") " Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.879928 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume" (OuterVolumeSpecName: "config-volume") pod "aad11684-a5b7-4df1-9d18-5179c6113f66" (UID: "aad11684-a5b7-4df1-9d18-5179c6113f66"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.885922 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h" (OuterVolumeSpecName: "kube-api-access-8cq4h") pod "aad11684-a5b7-4df1-9d18-5179c6113f66" (UID: "aad11684-a5b7-4df1-9d18-5179c6113f66"). InnerVolumeSpecName "kube-api-access-8cq4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.886301 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "aad11684-a5b7-4df1-9d18-5179c6113f66" (UID: "aad11684-a5b7-4df1-9d18-5179c6113f66"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.980531 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cq4h\" (UniqueName: \"kubernetes.io/projected/aad11684-a5b7-4df1-9d18-5179c6113f66-kube-api-access-8cq4h\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.980786 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/aad11684-a5b7-4df1-9d18-5179c6113f66-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:02 crc kubenswrapper[4772]: I0127 16:30:02.980796 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aad11684-a5b7-4df1-9d18-5179c6113f66-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:03 crc kubenswrapper[4772]: I0127 16:30:03.523036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" 
event={"ID":"aad11684-a5b7-4df1-9d18-5179c6113f66","Type":"ContainerDied","Data":"fb69b2bdd58763a29639c10b7607422660e39cd3e742d1e9cb84e4ab63d757cb"} Jan 27 16:30:03 crc kubenswrapper[4772]: I0127 16:30:03.523082 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb69b2bdd58763a29639c10b7607422660e39cd3e742d1e9cb84e4ab63d757cb" Jan 27 16:30:03 crc kubenswrapper[4772]: I0127 16:30:03.523094 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9" Jan 27 16:30:03 crc kubenswrapper[4772]: I0127 16:30:03.871853 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd"] Jan 27 16:30:03 crc kubenswrapper[4772]: I0127 16:30:03.876874 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-st6dd"] Jan 27 16:30:04 crc kubenswrapper[4772]: I0127 16:30:04.671488 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df82c0c4-9652-407e-b63d-17e2ccdb38aa" path="/var/lib/kubelet/pods/df82c0c4-9652-407e-b63d-17e2ccdb38aa/volumes" Jan 27 16:30:06 crc kubenswrapper[4772]: I0127 16:30:06.483484 4772 scope.go:117] "RemoveContainer" containerID="72ea0a33955c0509b888997e5b6ca0dc68de786a608fe5aae9035bbbf84ae773" Jan 27 16:30:06 crc kubenswrapper[4772]: I0127 16:30:06.664277 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:30:06 crc kubenswrapper[4772]: E0127 16:30:06.665023 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:18 crc kubenswrapper[4772]: I0127 16:30:18.664663 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:30:18 crc kubenswrapper[4772]: E0127 16:30:18.667681 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:29 crc kubenswrapper[4772]: I0127 16:30:29.663706 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:30:29 crc kubenswrapper[4772]: E0127 16:30:29.665805 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:41 crc kubenswrapper[4772]: I0127 16:30:41.664080 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:30:41 crc kubenswrapper[4772]: E0127 16:30:41.665415 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:44 crc kubenswrapper[4772]: I0127 16:30:44.996458 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:30:44 crc kubenswrapper[4772]: E0127 16:30:44.996988 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aad11684-a5b7-4df1-9d18-5179c6113f66" containerName="collect-profiles" Jan 27 16:30:44 crc kubenswrapper[4772]: I0127 16:30:44.996999 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aad11684-a5b7-4df1-9d18-5179c6113f66" containerName="collect-profiles" Jan 27 16:30:44 crc kubenswrapper[4772]: I0127 16:30:44.997134 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aad11684-a5b7-4df1-9d18-5179c6113f66" containerName="collect-profiles" Jan 27 16:30:44 crc kubenswrapper[4772]: I0127 16:30:44.998364 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.012725 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.070442 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.070507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5r6z\" (UniqueName: \"kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.070588 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.172074 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.172146 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-n5r6z\" (UniqueName: \"kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.172224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.172788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.172784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.191767 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5r6z\" (UniqueName: \"kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z\") pod \"redhat-marketplace-qcsqq\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.373361 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.849051 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:30:45 crc kubenswrapper[4772]: I0127 16:30:45.889353 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerStarted","Data":"5264e359274425b89a615b65a92f4223cfc875211991d693035591b51c1a420b"} Jan 27 16:30:46 crc kubenswrapper[4772]: I0127 16:30:46.897386 4772 generic.go:334] "Generic (PLEG): container finished" podID="1546001c-59c1-4641-b1a6-cfd263698406" containerID="52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984" exitCode=0 Jan 27 16:30:46 crc kubenswrapper[4772]: I0127 16:30:46.897719 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerDied","Data":"52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984"} Jan 27 16:30:48 crc kubenswrapper[4772]: I0127 16:30:47.912248 4772 generic.go:334] "Generic (PLEG): container finished" podID="1546001c-59c1-4641-b1a6-cfd263698406" containerID="453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1" exitCode=0 Jan 27 16:30:48 crc kubenswrapper[4772]: I0127 16:30:47.912371 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerDied","Data":"453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1"} Jan 27 16:30:48 crc kubenswrapper[4772]: I0127 16:30:48.920441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" 
event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerStarted","Data":"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf"} Jan 27 16:30:48 crc kubenswrapper[4772]: I0127 16:30:48.936502 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qcsqq" podStartSLOduration=3.404606659 podStartE2EDuration="4.936482557s" podCreationTimestamp="2026-01-27 16:30:44 +0000 UTC" firstStartedPulling="2026-01-27 16:30:46.898997168 +0000 UTC m=+5032.879606266" lastFinishedPulling="2026-01-27 16:30:48.430873066 +0000 UTC m=+5034.411482164" observedRunningTime="2026-01-27 16:30:48.933737148 +0000 UTC m=+5034.914346236" watchObservedRunningTime="2026-01-27 16:30:48.936482557 +0000 UTC m=+5034.917091655" Jan 27 16:30:52 crc kubenswrapper[4772]: I0127 16:30:52.663410 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:30:52 crc kubenswrapper[4772]: E0127 16:30:52.663973 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:30:55 crc kubenswrapper[4772]: I0127 16:30:55.374180 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:55 crc kubenswrapper[4772]: I0127 16:30:55.374490 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:55 crc kubenswrapper[4772]: I0127 16:30:55.417398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:56 crc kubenswrapper[4772]: I0127 16:30:56.014519 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:56 crc kubenswrapper[4772]: I0127 16:30:56.062428 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:30:57 crc kubenswrapper[4772]: I0127 16:30:57.989117 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qcsqq" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="registry-server" containerID="cri-o://6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf" gracePeriod=2 Jan 27 16:30:58 crc kubenswrapper[4772]: I0127 16:30:58.903995 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:58 crc kubenswrapper[4772]: I0127 16:30:58.999408 4772 generic.go:334] "Generic (PLEG): container finished" podID="1546001c-59c1-4641-b1a6-cfd263698406" containerID="6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf" exitCode=0 Jan 27 16:30:58 crc kubenswrapper[4772]: I0127 16:30:58.999461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerDied","Data":"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf"} Jan 27 16:30:58 crc kubenswrapper[4772]: I0127 16:30:58.999493 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qcsqq" event={"ID":"1546001c-59c1-4641-b1a6-cfd263698406","Type":"ContainerDied","Data":"5264e359274425b89a615b65a92f4223cfc875211991d693035591b51c1a420b"} Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:58.999493 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qcsqq" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:58.999576 4772 scope.go:117] "RemoveContainer" containerID="6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.021103 4772 scope.go:117] "RemoveContainer" containerID="453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.041158 4772 scope.go:117] "RemoveContainer" containerID="52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.074403 4772 scope.go:117] "RemoveContainer" containerID="6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf" Jan 27 16:30:59 crc kubenswrapper[4772]: E0127 16:30:59.074808 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf\": container with ID starting with 6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf not found: ID does not exist" containerID="6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.074857 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf"} err="failed to get container status \"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf\": rpc error: code = NotFound desc = could not find container \"6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf\": container with ID starting with 6c1eb65e8c06dfb0b2a366344e6cf4c14004706415f43851d993ecc534ead2cf not found: ID does not exist" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.074883 4772 scope.go:117] "RemoveContainer" 
containerID="453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1" Jan 27 16:30:59 crc kubenswrapper[4772]: E0127 16:30:59.075112 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1\": container with ID starting with 453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1 not found: ID does not exist" containerID="453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.075136 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1"} err="failed to get container status \"453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1\": rpc error: code = NotFound desc = could not find container \"453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1\": container with ID starting with 453a76da621aa580189ea4add3101e0ade10bc3693bb7f7aaeabf9887f214ee1 not found: ID does not exist" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.075152 4772 scope.go:117] "RemoveContainer" containerID="52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984" Jan 27 16:30:59 crc kubenswrapper[4772]: E0127 16:30:59.075420 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984\": container with ID starting with 52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984 not found: ID does not exist" containerID="52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.075446 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984"} err="failed to get container status \"52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984\": rpc error: code = NotFound desc = could not find container \"52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984\": container with ID starting with 52ff2eaac3c0328895089314015b1a4fde9c687e4fe951903045624423c75984 not found: ID does not exist" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.089680 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities\") pod \"1546001c-59c1-4641-b1a6-cfd263698406\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.089780 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content\") pod \"1546001c-59c1-4641-b1a6-cfd263698406\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.089875 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5r6z\" (UniqueName: \"kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z\") pod \"1546001c-59c1-4641-b1a6-cfd263698406\" (UID: \"1546001c-59c1-4641-b1a6-cfd263698406\") " Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.090935 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities" (OuterVolumeSpecName: "utilities") pod "1546001c-59c1-4641-b1a6-cfd263698406" (UID: "1546001c-59c1-4641-b1a6-cfd263698406"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.097109 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z" (OuterVolumeSpecName: "kube-api-access-n5r6z") pod "1546001c-59c1-4641-b1a6-cfd263698406" (UID: "1546001c-59c1-4641-b1a6-cfd263698406"). InnerVolumeSpecName "kube-api-access-n5r6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.112780 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1546001c-59c1-4641-b1a6-cfd263698406" (UID: "1546001c-59c1-4641-b1a6-cfd263698406"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.192287 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.192834 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1546001c-59c1-4641-b1a6-cfd263698406-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.192862 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5r6z\" (UniqueName: \"kubernetes.io/projected/1546001c-59c1-4641-b1a6-cfd263698406-kube-api-access-n5r6z\") on node \"crc\" DevicePath \"\"" Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 16:30:59.330676 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:30:59 crc kubenswrapper[4772]: I0127 
16:30:59.336340 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qcsqq"] Jan 27 16:31:00 crc kubenswrapper[4772]: I0127 16:31:00.674144 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1546001c-59c1-4641-b1a6-cfd263698406" path="/var/lib/kubelet/pods/1546001c-59c1-4641-b1a6-cfd263698406/volumes" Jan 27 16:31:05 crc kubenswrapper[4772]: I0127 16:31:05.663756 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:31:05 crc kubenswrapper[4772]: E0127 16:31:05.665042 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:31:17 crc kubenswrapper[4772]: I0127 16:31:17.663512 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:31:17 crc kubenswrapper[4772]: E0127 16:31:17.664352 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:31:31 crc kubenswrapper[4772]: I0127 16:31:31.662999 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:31:31 crc kubenswrapper[4772]: E0127 16:31:31.663868 4772 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:31:46 crc kubenswrapper[4772]: I0127 16:31:46.663084 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:31:46 crc kubenswrapper[4772]: E0127 16:31:46.663816 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.435821 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:31:47 crc kubenswrapper[4772]: E0127 16:31:47.436289 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="extract-utilities" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.436309 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="extract-utilities" Jan 27 16:31:47 crc kubenswrapper[4772]: E0127 16:31:47.436330 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="registry-server" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.436338 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="registry-server" Jan 27 16:31:47 crc kubenswrapper[4772]: E0127 16:31:47.436359 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="extract-content" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.436368 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="extract-content" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.436542 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1546001c-59c1-4641-b1a6-cfd263698406" containerName="registry-server" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.437812 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.449096 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.626065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.626147 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.626431 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdgz\" (UniqueName: \"kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.727885 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.727983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdgz\" (UniqueName: \"kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.728015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.728842 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.728965 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.749649 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvdgz\" (UniqueName: \"kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz\") pod \"certified-operators-4v5t2\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:47 crc kubenswrapper[4772]: I0127 16:31:47.755545 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:48 crc kubenswrapper[4772]: I0127 16:31:48.205870 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:31:48 crc kubenswrapper[4772]: I0127 16:31:48.400859 4772 generic.go:334] "Generic (PLEG): container finished" podID="c92585ec-743a-472c-b4dd-c2626dea5440" containerID="2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb" exitCode=0 Jan 27 16:31:48 crc kubenswrapper[4772]: I0127 16:31:48.400903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerDied","Data":"2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb"} Jan 27 16:31:48 crc kubenswrapper[4772]: I0127 16:31:48.400954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerStarted","Data":"96225b9958d3fda24c7fec9dc8c8c03d59127b7c25a0ead4b4398213ed1f54aa"} Jan 27 16:31:48 crc 
kubenswrapper[4772]: I0127 16:31:48.402476 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:31:50 crc kubenswrapper[4772]: I0127 16:31:50.417809 4772 generic.go:334] "Generic (PLEG): container finished" podID="c92585ec-743a-472c-b4dd-c2626dea5440" containerID="b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345" exitCode=0 Jan 27 16:31:50 crc kubenswrapper[4772]: I0127 16:31:50.417864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerDied","Data":"b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345"} Jan 27 16:31:51 crc kubenswrapper[4772]: I0127 16:31:51.427364 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerStarted","Data":"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54"} Jan 27 16:31:51 crc kubenswrapper[4772]: I0127 16:31:51.454575 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4v5t2" podStartSLOduration=2.045539772 podStartE2EDuration="4.45455218s" podCreationTimestamp="2026-01-27 16:31:47 +0000 UTC" firstStartedPulling="2026-01-27 16:31:48.402204391 +0000 UTC m=+5094.382813489" lastFinishedPulling="2026-01-27 16:31:50.811216799 +0000 UTC m=+5096.791825897" observedRunningTime="2026-01-27 16:31:51.445032145 +0000 UTC m=+5097.425641243" watchObservedRunningTime="2026-01-27 16:31:51.45455218 +0000 UTC m=+5097.435161278" Jan 27 16:31:57 crc kubenswrapper[4772]: I0127 16:31:57.756643 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:57 crc kubenswrapper[4772]: I0127 16:31:57.758133 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:57 crc kubenswrapper[4772]: I0127 16:31:57.800443 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:58 crc kubenswrapper[4772]: I0127 16:31:58.522427 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:31:58 crc kubenswrapper[4772]: I0127 16:31:58.573979 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:32:00 crc kubenswrapper[4772]: I0127 16:32:00.492585 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4v5t2" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="registry-server" containerID="cri-o://349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54" gracePeriod=2 Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.489870 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.508626 4772 generic.go:334] "Generic (PLEG): container finished" podID="c92585ec-743a-472c-b4dd-c2626dea5440" containerID="349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54" exitCode=0 Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.508679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerDied","Data":"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54"} Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.508696 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4v5t2" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.508712 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4v5t2" event={"ID":"c92585ec-743a-472c-b4dd-c2626dea5440","Type":"ContainerDied","Data":"96225b9958d3fda24c7fec9dc8c8c03d59127b7c25a0ead4b4398213ed1f54aa"} Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.508738 4772 scope.go:117] "RemoveContainer" containerID="349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.527410 4772 scope.go:117] "RemoveContainer" containerID="b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.545996 4772 scope.go:117] "RemoveContainer" containerID="2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.578543 4772 scope.go:117] "RemoveContainer" containerID="349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54" Jan 27 16:32:01 crc kubenswrapper[4772]: E0127 16:32:01.578940 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54\": container with ID starting with 349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54 not found: ID does not exist" containerID="349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.578973 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54"} err="failed to get container status \"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54\": rpc error: code = NotFound desc = could not find container 
\"349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54\": container with ID starting with 349c99af13ca6ed29ac408784b47f3d087af6dd7e6d509533e354880f0af0c54 not found: ID does not exist" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.578994 4772 scope.go:117] "RemoveContainer" containerID="b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345" Jan 27 16:32:01 crc kubenswrapper[4772]: E0127 16:32:01.579389 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345\": container with ID starting with b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345 not found: ID does not exist" containerID="b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.579445 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345"} err="failed to get container status \"b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345\": rpc error: code = NotFound desc = could not find container \"b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345\": container with ID starting with b61d7ffda814b78a2d4e085143511eac06adda78e765120d5bba47717b008345 not found: ID does not exist" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.579490 4772 scope.go:117] "RemoveContainer" containerID="2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb" Jan 27 16:32:01 crc kubenswrapper[4772]: E0127 16:32:01.579824 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb\": container with ID starting with 2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb not found: ID does not exist" 
containerID="2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.579853 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb"} err="failed to get container status \"2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb\": rpc error: code = NotFound desc = could not find container \"2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb\": container with ID starting with 2397a0459258571ef674ff88dc7d488cbbcfec7938b44eaae891914f57c1e0fb not found: ID does not exist" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.661747 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities\") pod \"c92585ec-743a-472c-b4dd-c2626dea5440\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.662375 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:32:01 crc kubenswrapper[4772]: E0127 16:32:01.662651 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.662959 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities" (OuterVolumeSpecName: "utilities") pod "c92585ec-743a-472c-b4dd-c2626dea5440" (UID: 
"c92585ec-743a-472c-b4dd-c2626dea5440"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.663019 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content\") pod \"c92585ec-743a-472c-b4dd-c2626dea5440\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.666452 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvdgz\" (UniqueName: \"kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz\") pod \"c92585ec-743a-472c-b4dd-c2626dea5440\" (UID: \"c92585ec-743a-472c-b4dd-c2626dea5440\") " Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.667101 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.673984 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz" (OuterVolumeSpecName: "kube-api-access-gvdgz") pod "c92585ec-743a-472c-b4dd-c2626dea5440" (UID: "c92585ec-743a-472c-b4dd-c2626dea5440"). InnerVolumeSpecName "kube-api-access-gvdgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.708955 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c92585ec-743a-472c-b4dd-c2626dea5440" (UID: "c92585ec-743a-472c-b4dd-c2626dea5440"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.768401 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c92585ec-743a-472c-b4dd-c2626dea5440-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.768434 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvdgz\" (UniqueName: \"kubernetes.io/projected/c92585ec-743a-472c-b4dd-c2626dea5440-kube-api-access-gvdgz\") on node \"crc\" DevicePath \"\"" Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.835624 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:32:01 crc kubenswrapper[4772]: I0127 16:32:01.842197 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4v5t2"] Jan 27 16:32:02 crc kubenswrapper[4772]: I0127 16:32:02.670762 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" path="/var/lib/kubelet/pods/c92585ec-743a-472c-b4dd-c2626dea5440/volumes" Jan 27 16:32:14 crc kubenswrapper[4772]: I0127 16:32:14.666761 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:32:14 crc kubenswrapper[4772]: E0127 16:32:14.667454 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.093196 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mariadb-copy-data"] Jan 27 16:32:23 crc kubenswrapper[4772]: E0127 16:32:23.094234 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="registry-server" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.094248 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="registry-server" Jan 27 16:32:23 crc kubenswrapper[4772]: E0127 16:32:23.094258 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="extract-content" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.094265 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="extract-content" Jan 27 16:32:23 crc kubenswrapper[4772]: E0127 16:32:23.094280 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="extract-utilities" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.094286 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="extract-utilities" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.094451 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92585ec-743a-472c-b4dd-c2626dea5440" containerName="registry-server" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.094966 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.097565 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jd6dc" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.100377 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.191153 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.191285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km8d9\" (UniqueName: \"kubernetes.io/projected/7db35434-01e2-470d-bb27-8e30189936b3-kube-api-access-km8d9\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.292931 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.293027 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km8d9\" (UniqueName: \"kubernetes.io/projected/7db35434-01e2-470d-bb27-8e30189936b3-kube-api-access-km8d9\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" 
Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.296040 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.296092 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/791abbe3041f2db8530b425c87a9deb2b029505cd287d16d9fe14f995d5e5eb5/globalmount\"" pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.317970 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km8d9\" (UniqueName: \"kubernetes.io/projected/7db35434-01e2-470d-bb27-8e30189936b3-kube-api-access-km8d9\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.331720 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6fda35a-22da-4119-a41a-9a4f5c51027a\") pod \"mariadb-copy-data\" (UID: \"7db35434-01e2-470d-bb27-8e30189936b3\") " pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.417932 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Jan 27 16:32:23 crc kubenswrapper[4772]: I0127 16:32:23.936063 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Jan 27 16:32:24 crc kubenswrapper[4772]: I0127 16:32:24.680625 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"7db35434-01e2-470d-bb27-8e30189936b3","Type":"ContainerStarted","Data":"1d7029faad9dc2bef8435c8c3a74852f826537085f88f1da26eca62aad4328e8"} Jan 27 16:32:24 crc kubenswrapper[4772]: I0127 16:32:24.680988 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"7db35434-01e2-470d-bb27-8e30189936b3","Type":"ContainerStarted","Data":"db1e9e0db949f173719176328889059cc417d2987fd47f63a4b1be6e62827be5"} Jan 27 16:32:24 crc kubenswrapper[4772]: I0127 16:32:24.699649 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.699630659 podStartE2EDuration="2.699630659s" podCreationTimestamp="2026-01-27 16:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:32:24.698496767 +0000 UTC m=+5130.679105885" watchObservedRunningTime="2026-01-27 16:32:24.699630659 +0000 UTC m=+5130.680239757" Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.357113 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.358689 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.368999 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.454502 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpxp\" (UniqueName: \"kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp\") pod \"mariadb-client\" (UID: \"4adf565d-c266-4781-9107-fd05a16d3a53\") " pod="openstack/mariadb-client" Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.555688 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpxp\" (UniqueName: \"kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp\") pod \"mariadb-client\" (UID: \"4adf565d-c266-4781-9107-fd05a16d3a53\") " pod="openstack/mariadb-client" Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.576258 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpxp\" (UniqueName: \"kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp\") pod \"mariadb-client\" (UID: \"4adf565d-c266-4781-9107-fd05a16d3a53\") " pod="openstack/mariadb-client" Jan 27 16:32:27 crc kubenswrapper[4772]: I0127 16:32:27.692248 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:28 crc kubenswrapper[4772]: I0127 16:32:28.106321 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:28 crc kubenswrapper[4772]: I0127 16:32:28.708074 4772 generic.go:334] "Generic (PLEG): container finished" podID="4adf565d-c266-4781-9107-fd05a16d3a53" containerID="f861f93267509f86f0bf58147bbab5f93435b373f74dada3332239c3cced3199" exitCode=0 Jan 27 16:32:28 crc kubenswrapper[4772]: I0127 16:32:28.708196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4adf565d-c266-4781-9107-fd05a16d3a53","Type":"ContainerDied","Data":"f861f93267509f86f0bf58147bbab5f93435b373f74dada3332239c3cced3199"} Jan 27 16:32:28 crc kubenswrapper[4772]: I0127 16:32:28.708413 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4adf565d-c266-4781-9107-fd05a16d3a53","Type":"ContainerStarted","Data":"0f87006e2d0c3ada96d5c2183b202a9d37b5b94d9a888b23022a549612fbd148"} Jan 27 16:32:29 crc kubenswrapper[4772]: I0127 16:32:29.663290 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:32:29 crc kubenswrapper[4772]: E0127 16:32:29.663583 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.028377 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.050699 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4adf565d-c266-4781-9107-fd05a16d3a53/mariadb-client/0.log" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.074111 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.080340 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.190707 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:30 crc kubenswrapper[4772]: E0127 16:32:30.191417 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adf565d-c266-4781-9107-fd05a16d3a53" containerName="mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.191440 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adf565d-c266-4781-9107-fd05a16d3a53" containerName="mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.191649 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adf565d-c266-4781-9107-fd05a16d3a53" containerName="mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.192514 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.197931 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.198483 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fpxp\" (UniqueName: \"kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp\") pod \"4adf565d-c266-4781-9107-fd05a16d3a53\" (UID: \"4adf565d-c266-4781-9107-fd05a16d3a53\") " Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.207528 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp" (OuterVolumeSpecName: "kube-api-access-5fpxp") pod "4adf565d-c266-4781-9107-fd05a16d3a53" (UID: "4adf565d-c266-4781-9107-fd05a16d3a53"). InnerVolumeSpecName "kube-api-access-5fpxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.300438 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pv8r\" (UniqueName: \"kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r\") pod \"mariadb-client\" (UID: \"6752954c-12b9-4b60-94bb-2f4676de7e6c\") " pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.300600 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fpxp\" (UniqueName: \"kubernetes.io/projected/4adf565d-c266-4781-9107-fd05a16d3a53-kube-api-access-5fpxp\") on node \"crc\" DevicePath \"\"" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.402544 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pv8r\" (UniqueName: \"kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r\") pod 
\"mariadb-client\" (UID: \"6752954c-12b9-4b60-94bb-2f4676de7e6c\") " pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.423389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pv8r\" (UniqueName: \"kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r\") pod \"mariadb-client\" (UID: \"6752954c-12b9-4b60-94bb-2f4676de7e6c\") " pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.536841 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.677030 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adf565d-c266-4781-9107-fd05a16d3a53" path="/var/lib/kubelet/pods/4adf565d-c266-4781-9107-fd05a16d3a53/volumes" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.725047 4772 scope.go:117] "RemoveContainer" containerID="f861f93267509f86f0bf58147bbab5f93435b373f74dada3332239c3cced3199" Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.725160 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:30 crc kubenswrapper[4772]: W0127 16:32:30.936085 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6752954c_12b9_4b60_94bb_2f4676de7e6c.slice/crio-e9724199e84cd11ca0a68cfbb4deb346f80fb41c1fbd3adf14ea00d4e3521552 WatchSource:0}: Error finding container e9724199e84cd11ca0a68cfbb4deb346f80fb41c1fbd3adf14ea00d4e3521552: Status 404 returned error can't find the container with id e9724199e84cd11ca0a68cfbb4deb346f80fb41c1fbd3adf14ea00d4e3521552 Jan 27 16:32:30 crc kubenswrapper[4772]: I0127 16:32:30.936948 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:31 crc kubenswrapper[4772]: I0127 16:32:31.738039 4772 generic.go:334] "Generic (PLEG): container finished" podID="6752954c-12b9-4b60-94bb-2f4676de7e6c" containerID="f4958ec9744454169fb58baabe20204293a4ff4790174c9c2079b9801fd7028d" exitCode=0 Jan 27 16:32:31 crc kubenswrapper[4772]: I0127 16:32:31.738131 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6752954c-12b9-4b60-94bb-2f4676de7e6c","Type":"ContainerDied","Data":"f4958ec9744454169fb58baabe20204293a4ff4790174c9c2079b9801fd7028d"} Jan 27 16:32:31 crc kubenswrapper[4772]: I0127 16:32:31.738387 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6752954c-12b9-4b60-94bb-2f4676de7e6c","Type":"ContainerStarted","Data":"e9724199e84cd11ca0a68cfbb4deb346f80fb41c1fbd3adf14ea00d4e3521552"} Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.164131 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.180634 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_6752954c-12b9-4b60-94bb-2f4676de7e6c/mariadb-client/0.log" Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.203068 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.208828 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.348777 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pv8r\" (UniqueName: \"kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r\") pod \"6752954c-12b9-4b60-94bb-2f4676de7e6c\" (UID: \"6752954c-12b9-4b60-94bb-2f4676de7e6c\") " Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.367526 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r" (OuterVolumeSpecName: "kube-api-access-4pv8r") pod "6752954c-12b9-4b60-94bb-2f4676de7e6c" (UID: "6752954c-12b9-4b60-94bb-2f4676de7e6c"). InnerVolumeSpecName "kube-api-access-4pv8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.450469 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pv8r\" (UniqueName: \"kubernetes.io/projected/6752954c-12b9-4b60-94bb-2f4676de7e6c-kube-api-access-4pv8r\") on node \"crc\" DevicePath \"\"" Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.752294 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9724199e84cd11ca0a68cfbb4deb346f80fb41c1fbd3adf14ea00d4e3521552" Jan 27 16:32:33 crc kubenswrapper[4772]: I0127 16:32:33.752365 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Jan 27 16:32:34 crc kubenswrapper[4772]: I0127 16:32:34.674723 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6752954c-12b9-4b60-94bb-2f4676de7e6c" path="/var/lib/kubelet/pods/6752954c-12b9-4b60-94bb-2f4676de7e6c/volumes" Jan 27 16:32:40 crc kubenswrapper[4772]: I0127 16:32:40.663376 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:32:40 crc kubenswrapper[4772]: E0127 16:32:40.664374 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:32:53 crc kubenswrapper[4772]: I0127 16:32:53.662852 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:32:53 crc kubenswrapper[4772]: E0127 16:32:53.663565 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:33:04 crc kubenswrapper[4772]: I0127 16:33:04.668428 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:33:04 crc kubenswrapper[4772]: E0127 16:33:04.669326 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:33:06 crc kubenswrapper[4772]: I0127 16:33:06.605070 4772 scope.go:117] "RemoveContainer" containerID="0bdb0516e5ad0fcd11824f097428db46c1768aa20c8111ee97d7a876d3f00649" Jan 27 16:33:06 crc kubenswrapper[4772]: I0127 16:33:06.629258 4772 scope.go:117] "RemoveContainer" containerID="dbf03a502ec31eeb1c32c76f40edb6250c7c04cd55512165e350a627fdf8e1b7" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.615957 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 16:33:07 crc kubenswrapper[4772]: E0127 16:33:07.616608 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6752954c-12b9-4b60-94bb-2f4676de7e6c" containerName="mariadb-client" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.616626 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6752954c-12b9-4b60-94bb-2f4676de7e6c" containerName="mariadb-client" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.616816 4772 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6752954c-12b9-4b60-94bb-2f4676de7e6c" containerName="mariadb-client" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.617635 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.620332 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.621248 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.621551 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-m86nf" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.635039 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.657385 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.658762 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.665132 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.665139 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.665430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.665868 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-config\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.665992 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.666032 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49jj\" (UniqueName: 
\"kubernetes.io/projected/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-kube-api-access-b49jj\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.666076 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.666333 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.678052 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.697449 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768235 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbpbb\" (UniqueName: \"kubernetes.io/projected/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-kube-api-access-tbpbb\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768290 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcvh\" (UniqueName: \"kubernetes.io/projected/3e19e84f-6d5e-455b-be78-ae3f04c925b7-kube-api-access-slcvh\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768376 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-config\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768404 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-config\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768537 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768553 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768587 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768608 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768631 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49jj\" (UniqueName: \"kubernetes.io/projected/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-kube-api-access-b49jj\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768664 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-scripts\") pod \"ovsdbserver-nb-1\" (UID: 
\"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768699 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768717 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768776 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e19e84f-6d5e-455b-be78-ae3f04c925b7-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.768892 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.769313 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e19e84f-6d5e-455b-be78-ae3f04c925b7-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc 
kubenswrapper[4772]: I0127 16:33:07.769664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.769956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-config\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.770343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.773196 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.773236 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3337b6a472fbda1da50af1d325792e8abe5f3c077d3bd6ebb24d51801a8d002b/globalmount\"" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.774613 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.785871 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49jj\" (UniqueName: \"kubernetes.io/projected/c7dba285-1db4-44d8-bdf4-9de6e8d80adb-kube-api-access-b49jj\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.797984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8751ebf2-d508-441e-bc69-ce17a1a3281f\") pod \"ovsdbserver-nb-0\" (UID: \"c7dba285-1db4-44d8-bdf4-9de6e8d80adb\") " pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.833317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.836906 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.839135 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.839538 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h7s5j" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.839855 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.843656 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.851087 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.853115 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.862576 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.864191 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870229 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870295 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e19e84f-6d5e-455b-be78-ae3f04c925b7-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e19e84f-6d5e-455b-be78-ae3f04c925b7-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870404 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbpbb\" (UniqueName: \"kubernetes.io/projected/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-kube-api-access-tbpbb\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870429 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slcvh\" (UniqueName: \"kubernetes.io/projected/3e19e84f-6d5e-455b-be78-ae3f04c925b7-kube-api-access-slcvh\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870693 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-config\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.870844 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.871704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.871946 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.873082 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.873156 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.873311 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-config\") pod \"ovsdbserver-nb-1\" (UID: 
\"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.873393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e19e84f-6d5e-455b-be78-ae3f04c925b7-config\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.873706 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e19e84f-6d5e-455b-be78-ae3f04c925b7-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.886061 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.886102 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1c651a70b282613992325926fead9315e48244190040972e3c56616ccce54862/globalmount\"" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.886144 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e19e84f-6d5e-455b-be78-ae3f04c925b7-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.886694 4772 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.886730 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a098fd0db4cba2f79026a162d263b402d8997356cd854ca8956233b9a2e11413/globalmount\"" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.888458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.900237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.904887 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcvh\" (UniqueName: \"kubernetes.io/projected/3e19e84f-6d5e-455b-be78-ae3f04c925b7-kube-api-access-slcvh\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.905636 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbpbb\" (UniqueName: \"kubernetes.io/projected/d647bfb5-69e6-4b10-96ac-5f7fcd72514f-kube-api-access-tbpbb\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 
16:33:07.925672 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b3f26a0e-bac9-4500-b628-aafb6e4b1c41\") pod \"ovsdbserver-nb-1\" (UID: \"d647bfb5-69e6-4b10-96ac-5f7fcd72514f\") " pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.931578 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d6414222-b877-4ed8-964f-8b67b2ad1611\") pod \"ovsdbserver-nb-2\" (UID: \"3e19e84f-6d5e-455b-be78-ae3f04c925b7\") " pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.942859 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972489 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7fw\" (UniqueName: \"kubernetes.io/projected/f70d2878-d629-4772-b2a4-697fe18a3760-kube-api-access-2l7fw\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-config\") pod \"ovsdbserver-sb-1\" (UID: 
\"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972589 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6e7e5c6-90b8-4de9-ae6a-11034616734a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972626 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-config\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972642 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75n4l\" (UniqueName: \"kubernetes.io/projected/e6e7e5c6-90b8-4de9-ae6a-11034616734a-kube-api-access-75n4l\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972684 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70d2878-d629-4772-b2a4-697fe18a3760-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972703 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f70d2878-d629-4772-b2a4-697fe18a3760-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972744 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7e5c6-90b8-4de9-ae6a-11034616734a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972772 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.972792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.980844 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:07 crc kubenswrapper[4772]: I0127 16:33:07.995659 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074267 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-config\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074296 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074350 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc 
kubenswrapper[4772]: I0127 16:33:08.074371 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7fw\" (UniqueName: \"kubernetes.io/projected/f70d2878-d629-4772-b2a4-697fe18a3760-kube-api-access-2l7fw\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074389 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-config\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074409 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6e7e5c6-90b8-4de9-ae6a-11034616734a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074458 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074482 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bk4\" (UniqueName: \"kubernetes.io/projected/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-kube-api-access-v2bk4\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074512 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-config\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074536 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75n4l\" (UniqueName: \"kubernetes.io/projected/e6e7e5c6-90b8-4de9-ae6a-11034616734a-kube-api-access-75n4l\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074560 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074596 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70d2878-d629-4772-b2a4-697fe18a3760-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.074620 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f70d2878-d629-4772-b2a4-697fe18a3760-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.075828 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-config\") pod \"ovsdbserver-sb-1\" (UID: 
\"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.077111 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.078011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e6e7e5c6-90b8-4de9-ae6a-11034616734a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.078472 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f70d2878-d629-4772-b2a4-697fe18a3760-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.078640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7e5c6-90b8-4de9-ae6a-11034616734a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.078756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.078818 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.079087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f70d2878-d629-4772-b2a4-697fe18a3760-config\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.080251 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e7e5c6-90b8-4de9-ae6a-11034616734a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.081720 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.081750 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc749d8993a334b6229d6a39c6b8333d09387f0305fdac2d4a2f13487d621faf/globalmount\"" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.082199 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.082247 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/64e51a673028894bd1a32e587e28872875075ce3051c01bb423c11d2fb7dc74d/globalmount\"" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.082388 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70d2878-d629-4772-b2a4-697fe18a3760-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.083184 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7e5c6-90b8-4de9-ae6a-11034616734a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.099787 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7fw\" (UniqueName: \"kubernetes.io/projected/f70d2878-d629-4772-b2a4-697fe18a3760-kube-api-access-2l7fw\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.100549 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75n4l\" (UniqueName: \"kubernetes.io/projected/e6e7e5c6-90b8-4de9-ae6a-11034616734a-kube-api-access-75n4l\") pod \"ovsdbserver-sb-1\" (UID: 
\"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.128643 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9fb8f666-d05c-49d9-bf4f-89de48d00eee\") pod \"ovsdbserver-sb-0\" (UID: \"f70d2878-d629-4772-b2a4-697fe18a3760\") " pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.129453 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0f7281d-c4e2-4aa0-b339-d80e66a50bb8\") pod \"ovsdbserver-sb-1\" (UID: \"e6e7e5c6-90b8-4de9-ae6a-11034616734a\") " pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.162237 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180732 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-config\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180760 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180795 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.180875 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bk4\" (UniqueName: \"kubernetes.io/projected/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-kube-api-access-v2bk4\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.181437 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.181704 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-config\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " 
pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.182473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.183134 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.183183 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ecb883243e8fa28e04401fc11c1cc3b48caf5d4b7f163305d47b1cc276850bf3/globalmount\"" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.185338 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.208254 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bk4\" (UniqueName: \"kubernetes.io/projected/e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2-kube-api-access-v2bk4\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.212855 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a586ac9-f601-4b7a-8500-43d5056dca11\") pod \"ovsdbserver-sb-2\" (UID: \"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2\") " pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.248882 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.256396 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.474663 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.583446 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Jan 27 16:33:08 crc kubenswrapper[4772]: W0127 16:33:08.584365 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e19e84f_6d5e_455b_be78_ae3f04c925b7.slice/crio-48f52bd514e908fcf4864fb87e7374d832e8024e0d21ce8b7bcc182f3ed75e05 WatchSource:0}: Error finding container 48f52bd514e908fcf4864fb87e7374d832e8024e0d21ce8b7bcc182f3ed75e05: Status 404 returned error can't find the container with id 48f52bd514e908fcf4864fb87e7374d832e8024e0d21ce8b7bcc182f3ed75e05 Jan 27 16:33:08 crc kubenswrapper[4772]: W0127 16:33:08.674335 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd647bfb5_69e6_4b10_96ac_5f7fcd72514f.slice/crio-074db820aa56e1d3e692305b41061d7a8d773275074c4a730fe63818a3c3370a WatchSource:0}: Error finding container 074db820aa56e1d3e692305b41061d7a8d773275074c4a730fe63818a3c3370a: Status 404 returned error can't find the container with id 
074db820aa56e1d3e692305b41061d7a8d773275074c4a730fe63818a3c3370a Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.675795 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.794777 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 16:33:08 crc kubenswrapper[4772]: I0127 16:33:08.884214 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.040775 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f70d2878-d629-4772-b2a4-697fe18a3760","Type":"ContainerStarted","Data":"bfb61b64ab9af272f753949b50afcd24287fb693d96febe095e3a45ae3c76473"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.042307 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d647bfb5-69e6-4b10-96ac-5f7fcd72514f","Type":"ContainerStarted","Data":"920ffd185ec2c4037d40e133e1096d42c4c2c1a7b8cd18f5f18c96e7c0a0ce1e"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.042330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d647bfb5-69e6-4b10-96ac-5f7fcd72514f","Type":"ContainerStarted","Data":"074db820aa56e1d3e692305b41061d7a8d773275074c4a730fe63818a3c3370a"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.045483 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3e19e84f-6d5e-455b-be78-ae3f04c925b7","Type":"ContainerStarted","Data":"d1b4adcc8b2d207a74c0d46c315c74c8b626767da359aff95a2d871c59ab047c"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.045580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" 
event={"ID":"3e19e84f-6d5e-455b-be78-ae3f04c925b7","Type":"ContainerStarted","Data":"48f52bd514e908fcf4864fb87e7374d832e8024e0d21ce8b7bcc182f3ed75e05"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.047905 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c7dba285-1db4-44d8-bdf4-9de6e8d80adb","Type":"ContainerStarted","Data":"9a393749b3e8c476a8dbf09db1e5516a9dd70ecbfe7b88f62d8f1b2fc7c9ae27"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.047935 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c7dba285-1db4-44d8-bdf4-9de6e8d80adb","Type":"ContainerStarted","Data":"0cc7431e3cf8f4c3bcfa0b8a901f57d7a9bb4d5ab838913051b827521f69d36e"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.049386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e6e7e5c6-90b8-4de9-ae6a-11034616734a","Type":"ContainerStarted","Data":"ad163338ac498171cb6edadfe863d35da50a888044da8366acca9552904d5338"} Jan 27 16:33:09 crc kubenswrapper[4772]: I0127 16:33:09.482377 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.059690 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f70d2878-d629-4772-b2a4-697fe18a3760","Type":"ContainerStarted","Data":"b503e0fd0a7c27ff0d459996ac0d39e04861add0c7e9fbea62248cb0d2dc88c6"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.060037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f70d2878-d629-4772-b2a4-697fe18a3760","Type":"ContainerStarted","Data":"5432eb5f67260235f5ffdc38b09f50215a4ec704faf76ba8beca3ca36be7ae8d"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.063099 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2","Type":"ContainerStarted","Data":"fa7be627a5545262b52e10c28a8cf3593200f1814d5322934ff1be15d6d478d6"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.063143 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2","Type":"ContainerStarted","Data":"526b8505d35eaf62b78e16cc3783b92f6bf3d401410bc5db08769f91af92bd19"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.063160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2","Type":"ContainerStarted","Data":"cedf4ee1f5816248990bbe8ae54ca3ff34b16c8acb95ba79a94f3b775896fb38"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.065141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d647bfb5-69e6-4b10-96ac-5f7fcd72514f","Type":"ContainerStarted","Data":"482c09bcb742316ddf7d6825e1bc9fe06aff97b0a598c35e0564d7e1e35cad4a"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.066694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"3e19e84f-6d5e-455b-be78-ae3f04c925b7","Type":"ContainerStarted","Data":"9063b11d3bd1855e7851cb3e5775f3995c3aecd5ae985951e817a21513a48b6a"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.068422 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c7dba285-1db4-44d8-bdf4-9de6e8d80adb","Type":"ContainerStarted","Data":"887dd01207c102e69b2f99fc63206e19204eda0c285f9027d0cc9857454da907"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.071196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e6e7e5c6-90b8-4de9-ae6a-11034616734a","Type":"ContainerStarted","Data":"6cd0d8b740bdd89690b3ad2480d1c3a5b7725050dc64d6cead32c124c96648f0"} Jan 27 16:33:10 crc 
kubenswrapper[4772]: I0127 16:33:10.071226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"e6e7e5c6-90b8-4de9-ae6a-11034616734a","Type":"ContainerStarted","Data":"9129b568ab6a7ae493d7984fd465da0ccfbb3dc21791f3a4f8c8c37bf88d13fa"} Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.080497 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.080481958 podStartE2EDuration="4.080481958s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.075787034 +0000 UTC m=+5176.056396142" watchObservedRunningTime="2026-01-27 16:33:10.080481958 +0000 UTC m=+5176.061091056" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.104767 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.104742029 podStartE2EDuration="4.104742029s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.098913433 +0000 UTC m=+5176.079522531" watchObservedRunningTime="2026-01-27 16:33:10.104742029 +0000 UTC m=+5176.085351127" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.120553 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.120532529 podStartE2EDuration="4.120532529s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.118308346 +0000 UTC m=+5176.098917444" watchObservedRunningTime="2026-01-27 16:33:10.120532529 +0000 UTC m=+5176.101141627" Jan 27 16:33:10 crc kubenswrapper[4772]: 
I0127 16:33:10.142966 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.142944188 podStartE2EDuration="4.142944188s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.13634211 +0000 UTC m=+5176.116951218" watchObservedRunningTime="2026-01-27 16:33:10.142944188 +0000 UTC m=+5176.123553296" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.156715 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.15669592 podStartE2EDuration="4.15669592s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.155053433 +0000 UTC m=+5176.135662541" watchObservedRunningTime="2026-01-27 16:33:10.15669592 +0000 UTC m=+5176.137305018" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.182038 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.182017332 podStartE2EDuration="4.182017332s" podCreationTimestamp="2026-01-27 16:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:10.171701478 +0000 UTC m=+5176.152310576" watchObservedRunningTime="2026-01-27 16:33:10.182017332 +0000 UTC m=+5176.162626440" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.943811 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:10 crc kubenswrapper[4772]: I0127 16:33:10.981423 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:10 crc 
kubenswrapper[4772]: I0127 16:33:10.995780 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:11 crc kubenswrapper[4772]: I0127 16:33:11.162923 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:11 crc kubenswrapper[4772]: I0127 16:33:11.250525 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:11 crc kubenswrapper[4772]: I0127 16:33:11.257666 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:12 crc kubenswrapper[4772]: I0127 16:33:12.943071 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:12 crc kubenswrapper[4772]: I0127 16:33:12.981728 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:12 crc kubenswrapper[4772]: I0127 16:33:12.996124 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:13 crc kubenswrapper[4772]: I0127 16:33:13.162470 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:13 crc kubenswrapper[4772]: I0127 16:33:13.250241 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:13 crc kubenswrapper[4772]: I0127 16:33:13.257774 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:13 crc kubenswrapper[4772]: I0127 16:33:13.979589 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.016607 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" 
Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.131204 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.142615 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.148903 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.177252 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.208833 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.284785 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.308797 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.325688 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.379435 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.413211 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.417103 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-qqlrk"] Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.418964 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.423216 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.448049 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-qqlrk"] Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.577928 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-qqlrk"] Jan 27 16:33:14 crc kubenswrapper[4772]: E0127 16:33:14.578713 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-mnwh9 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" podUID="5669174d-af83-4842-9e6f-bf2a32c40bf5" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.598947 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.599015 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.599079 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc\") pod 
\"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.599146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwh9\" (UniqueName: \"kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.610465 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.612023 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.618205 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.624141 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.700250 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwh9\" (UniqueName: \"kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.700349 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " 
pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.700394 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.700462 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.701496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.702954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.706152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.721561 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mnwh9\" (UniqueName: \"kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9\") pod \"dnsmasq-dns-6f7b485f7-qqlrk\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.802151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.802323 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.802386 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.802484 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.802542 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjfw7\" (UniqueName: \"kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.903795 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.903866 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.903906 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.904891 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjfw7\" (UniqueName: \"kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.904847 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.904962 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.905007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.905036 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.905633 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.922464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjfw7\" (UniqueName: 
\"kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7\") pod \"dnsmasq-dns-746d786469-qv9pq\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:14 crc kubenswrapper[4772]: I0127 16:33:14.940424 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.106433 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.121469 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.209682 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc\") pod \"5669174d-af83-4842-9e6f-bf2a32c40bf5\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.209749 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config\") pod \"5669174d-af83-4842-9e6f-bf2a32c40bf5\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.209780 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb\") pod \"5669174d-af83-4842-9e6f-bf2a32c40bf5\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.209915 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnwh9\" (UniqueName: 
\"kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9\") pod \"5669174d-af83-4842-9e6f-bf2a32c40bf5\" (UID: \"5669174d-af83-4842-9e6f-bf2a32c40bf5\") " Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.210364 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config" (OuterVolumeSpecName: "config") pod "5669174d-af83-4842-9e6f-bf2a32c40bf5" (UID: "5669174d-af83-4842-9e6f-bf2a32c40bf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.210498 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5669174d-af83-4842-9e6f-bf2a32c40bf5" (UID: "5669174d-af83-4842-9e6f-bf2a32c40bf5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.210949 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5669174d-af83-4842-9e6f-bf2a32c40bf5" (UID: "5669174d-af83-4842-9e6f-bf2a32c40bf5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.211149 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.211184 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.217016 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9" (OuterVolumeSpecName: "kube-api-access-mnwh9") pod "5669174d-af83-4842-9e6f-bf2a32c40bf5" (UID: "5669174d-af83-4842-9e6f-bf2a32c40bf5"). InnerVolumeSpecName "kube-api-access-mnwh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.312891 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnwh9\" (UniqueName: \"kubernetes.io/projected/5669174d-af83-4842-9e6f-bf2a32c40bf5-kube-api-access-mnwh9\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.312934 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5669174d-af83-4842-9e6f-bf2a32c40bf5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.346879 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:15 crc kubenswrapper[4772]: I0127 16:33:15.662944 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:33:15 crc kubenswrapper[4772]: E0127 16:33:15.663524 4772 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.114490 4772 generic.go:334] "Generic (PLEG): container finished" podID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerID="02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf" exitCode=0 Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.114584 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746d786469-qv9pq" event={"ID":"320b0d26-aa22-4854-ac28-4699f95fb37b","Type":"ContainerDied","Data":"02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf"} Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.114599 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f7b485f7-qqlrk" Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.114619 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746d786469-qv9pq" event={"ID":"320b0d26-aa22-4854-ac28-4699f95fb37b","Type":"ContainerStarted","Data":"551776dfcff4ae5d83241e31aab7653a20a9814c77297414bc41ae9b8bf9cecd"} Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.245921 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-qqlrk"] Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.256696 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f7b485f7-qqlrk"] Jan 27 16:33:16 crc kubenswrapper[4772]: I0127 16:33:16.685003 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5669174d-af83-4842-9e6f-bf2a32c40bf5" path="/var/lib/kubelet/pods/5669174d-af83-4842-9e6f-bf2a32c40bf5/volumes" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.079553 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.082903 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.086207 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.101691 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.126201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746d786469-qv9pq" event={"ID":"320b0d26-aa22-4854-ac28-4699f95fb37b","Type":"ContainerStarted","Data":"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691"} Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.126578 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.145032 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-746d786469-qv9pq" podStartSLOduration=3.145016254 podStartE2EDuration="3.145016254s" podCreationTimestamp="2026-01-27 16:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:17.140921987 +0000 UTC m=+5183.121531115" watchObservedRunningTime="2026-01-27 16:33:17.145016254 +0000 UTC m=+5183.125625352" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.242040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6d673b09-a15f-48fc-b399-212dc30fce29-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.242234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.242428 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsvvx\" (UniqueName: \"kubernetes.io/projected/6d673b09-a15f-48fc-b399-212dc30fce29-kube-api-access-wsvvx\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.344369 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6d673b09-a15f-48fc-b399-212dc30fce29-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.344459 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.344525 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsvvx\" (UniqueName: \"kubernetes.io/projected/6d673b09-a15f-48fc-b399-212dc30fce29-kube-api-access-wsvvx\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.347867 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.347917 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a0d0ab9791e17485b8426753f6dd9038e92be360783b83ca2c6a8c88a22753f/globalmount\"" pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.358141 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/6d673b09-a15f-48fc-b399-212dc30fce29-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.363243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsvvx\" (UniqueName: \"kubernetes.io/projected/6d673b09-a15f-48fc-b399-212dc30fce29-kube-api-access-wsvvx\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.378911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4ff4ab2b-2cd2-45d6-b694-fe0a25175676\") pod \"ovn-copy-data\" (UID: \"6d673b09-a15f-48fc-b399-212dc30fce29\") " pod="openstack/ovn-copy-data" Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.408999 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data"
Jan 27 16:33:17 crc kubenswrapper[4772]: I0127 16:33:17.908904 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Jan 27 16:33:17 crc kubenswrapper[4772]: W0127 16:33:17.912434 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d673b09_a15f_48fc_b399_212dc30fce29.slice/crio-1357c90249e46be748745d14abaf363cb618d3df528722c24847f6ae9a771835 WatchSource:0}: Error finding container 1357c90249e46be748745d14abaf363cb618d3df528722c24847f6ae9a771835: Status 404 returned error can't find the container with id 1357c90249e46be748745d14abaf363cb618d3df528722c24847f6ae9a771835
Jan 27 16:33:18 crc kubenswrapper[4772]: I0127 16:33:18.133985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6d673b09-a15f-48fc-b399-212dc30fce29","Type":"ContainerStarted","Data":"40145951fa81f81ceb443f4e7a103616579dc22c8564338129dc861532e12471"}
Jan 27 16:33:18 crc kubenswrapper[4772]: I0127 16:33:18.134408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"6d673b09-a15f-48fc-b399-212dc30fce29","Type":"ContainerStarted","Data":"1357c90249e46be748745d14abaf363cb618d3df528722c24847f6ae9a771835"}
Jan 27 16:33:18 crc kubenswrapper[4772]: I0127 16:33:18.155204 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.155159506 podStartE2EDuration="2.155159506s" podCreationTimestamp="2026-01-27 16:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:18.151211703 +0000 UTC m=+5184.131820811" watchObservedRunningTime="2026-01-27 16:33:18.155159506 +0000 UTC m=+5184.135768604"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.028949 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.035730 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.039003 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.039262 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-xbpms"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.039394 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.052922 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.153946 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c221996-e15f-4fe3-bc62-98aac08f546f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.154005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfjn\" (UniqueName: \"kubernetes.io/projected/9c221996-e15f-4fe3-bc62-98aac08f546f-kube-api-access-mhfjn\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.154028 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c221996-e15f-4fe3-bc62-98aac08f546f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.154055 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-config\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.154138 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-scripts\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.255472 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-scripts\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.255619 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c221996-e15f-4fe3-bc62-98aac08f546f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.255650 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfjn\" (UniqueName: \"kubernetes.io/projected/9c221996-e15f-4fe3-bc62-98aac08f546f-kube-api-access-mhfjn\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.255677 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c221996-e15f-4fe3-bc62-98aac08f546f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.255701 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-config\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.256852 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-config\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.257549 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c221996-e15f-4fe3-bc62-98aac08f546f-scripts\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.258548 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9c221996-e15f-4fe3-bc62-98aac08f546f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.263986 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c221996-e15f-4fe3-bc62-98aac08f546f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.285050 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfjn\" (UniqueName: \"kubernetes.io/projected/9c221996-e15f-4fe3-bc62-98aac08f546f-kube-api-access-mhfjn\") pod \"ovn-northd-0\" (UID: \"9c221996-e15f-4fe3-bc62-98aac08f546f\") " pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.374059 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 27 16:33:23 crc kubenswrapper[4772]: I0127 16:33:23.811256 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 27 16:33:23 crc kubenswrapper[4772]: W0127 16:33:23.816789 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c221996_e15f_4fe3_bc62_98aac08f546f.slice/crio-1d9dd40e970c1b7120b85e055948429cd5f75b4a60d029ba58b71540e785fa8f WatchSource:0}: Error finding container 1d9dd40e970c1b7120b85e055948429cd5f75b4a60d029ba58b71540e785fa8f: Status 404 returned error can't find the container with id 1d9dd40e970c1b7120b85e055948429cd5f75b4a60d029ba58b71540e785fa8f
Jan 27 16:33:23 crc kubenswrapper[4772]: E0127 16:33:23.953507 4772 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.134:34104->38.129.56.134:35895: write tcp 38.129.56.134:34104->38.129.56.134:35895: write: broken pipe
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.189612 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c221996-e15f-4fe3-bc62-98aac08f546f","Type":"ContainerStarted","Data":"d06cea6eaa4e35aa000408c4b69138af9e6765b1225e007fdaeb79b67273c595"}
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.189650 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c221996-e15f-4fe3-bc62-98aac08f546f","Type":"ContainerStarted","Data":"922270d0a409360020cdc5184f402a6dc5344a591d52c8f4c4caa5187ff1e91a"}
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.189661 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"9c221996-e15f-4fe3-bc62-98aac08f546f","Type":"ContainerStarted","Data":"1d9dd40e970c1b7120b85e055948429cd5f75b4a60d029ba58b71540e785fa8f"}
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.189763 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.207672 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.2076514760000001 podStartE2EDuration="1.207651476s" podCreationTimestamp="2026-01-27 16:33:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:24.206426561 +0000 UTC m=+5190.187035659" watchObservedRunningTime="2026-01-27 16:33:24.207651476 +0000 UTC m=+5190.188260574"
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.942498 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-746d786469-qv9pq"
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.994636 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"]
Jan 27 16:33:24 crc kubenswrapper[4772]: I0127 16:33:24.995121 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="dnsmasq-dns" containerID="cri-o://a355488297e660433fbaef5e701f58797760e94b791d017a807af73346bc3756" gracePeriod=10
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.109665 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.244:5353: connect: connection refused"
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.202583 4772 generic.go:334] "Generic (PLEG): container finished" podID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerID="a355488297e660433fbaef5e701f58797760e94b791d017a807af73346bc3756" exitCode=0
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.203520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" event={"ID":"51a5db5c-8de5-441d-a8e9-7c07acc7df31","Type":"ContainerDied","Data":"a355488297e660433fbaef5e701f58797760e94b791d017a807af73346bc3756"}
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.481567 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll"
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.598643 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m72l\" (UniqueName: \"kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l\") pod \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") "
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.598714 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config\") pod \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") "
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.598854 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc\") pod \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\" (UID: \"51a5db5c-8de5-441d-a8e9-7c07acc7df31\") "
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.606409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l" (OuterVolumeSpecName: "kube-api-access-7m72l") pod "51a5db5c-8de5-441d-a8e9-7c07acc7df31" (UID: "51a5db5c-8de5-441d-a8e9-7c07acc7df31"). InnerVolumeSpecName "kube-api-access-7m72l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.635137 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51a5db5c-8de5-441d-a8e9-7c07acc7df31" (UID: "51a5db5c-8de5-441d-a8e9-7c07acc7df31"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.635821 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config" (OuterVolumeSpecName: "config") pod "51a5db5c-8de5-441d-a8e9-7c07acc7df31" (UID: "51a5db5c-8de5-441d-a8e9-7c07acc7df31"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.700691 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.700724 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m72l\" (UniqueName: \"kubernetes.io/projected/51a5db5c-8de5-441d-a8e9-7c07acc7df31-kube-api-access-7m72l\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:25 crc kubenswrapper[4772]: I0127 16:33:25.700734 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51a5db5c-8de5-441d-a8e9-7c07acc7df31-config\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.210774 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll" event={"ID":"51a5db5c-8de5-441d-a8e9-7c07acc7df31","Type":"ContainerDied","Data":"cd506cf5289047aba524d3325375f9767877e5e9af047a505f68883c25d7ad72"}
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.211111 4772 scope.go:117] "RemoveContainer" containerID="a355488297e660433fbaef5e701f58797760e94b791d017a807af73346bc3756"
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.210866 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-ndkll"
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.242590 4772 scope.go:117] "RemoveContainer" containerID="2b48ddd2c1b4ddcb3e7d2673318ce52a0b89618a27b83de04520bd8160030f43"
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.245891 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"]
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.252035 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-ndkll"]
Jan 27 16:33:26 crc kubenswrapper[4772]: I0127 16:33:26.671864 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" path="/var/lib/kubelet/pods/51a5db5c-8de5-441d-a8e9-7c07acc7df31/volumes"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.071215 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-jb5ch"]
Jan 27 16:33:28 crc kubenswrapper[4772]: E0127 16:33:28.071967 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="init"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.071984 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="init"
Jan 27 16:33:28 crc kubenswrapper[4772]: E0127 16:33:28.072020 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="dnsmasq-dns"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.072028 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="dnsmasq-dns"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.072232 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a5db5c-8de5-441d-a8e9-7c07acc7df31" containerName="dnsmasq-dns"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.072897 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.083414 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jb5ch"]
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.137588 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b8sp\" (UniqueName: \"kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.138450 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.179633 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0c73-account-create-update-nz44z"]
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.181630 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.184677 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.189892 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c73-account-create-update-nz44z"]
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.240129 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.240219 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b8sp\" (UniqueName: \"kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.240961 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.261627 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b8sp\" (UniqueName: \"kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp\") pod \"keystone-db-create-jb5ch\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") " pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.341263 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.341601 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz8d\" (UniqueName: \"kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.395697 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.443674 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khz8d\" (UniqueName: \"kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.443788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.444751 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.463133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz8d\" (UniqueName: \"kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d\") pod \"keystone-0c73-account-create-update-nz44z\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") " pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.501018 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.883955 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-jb5ch"]
Jan 27 16:33:28 crc kubenswrapper[4772]: W0127 16:33:28.885466 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5292a043_9ee5_4d14_a991_c50dbf4d136e.slice/crio-723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0 WatchSource:0}: Error finding container 723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0: Status 404 returned error can't find the container with id 723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0
Jan 27 16:33:28 crc kubenswrapper[4772]: I0127 16:33:28.944705 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0c73-account-create-update-nz44z"]
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.236448 4772 generic.go:334] "Generic (PLEG): container finished" podID="5292a043-9ee5-4d14-a991-c50dbf4d136e" containerID="6b8b1c4b2ab6a42f7f2fdd3da73f914caa96bb6a3722955fd242601b775105ee" exitCode=0
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.236504 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jb5ch" event={"ID":"5292a043-9ee5-4d14-a991-c50dbf4d136e","Type":"ContainerDied","Data":"6b8b1c4b2ab6a42f7f2fdd3da73f914caa96bb6a3722955fd242601b775105ee"}
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.236583 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jb5ch" event={"ID":"5292a043-9ee5-4d14-a991-c50dbf4d136e","Type":"ContainerStarted","Data":"723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0"}
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.238043 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c73-account-create-update-nz44z" event={"ID":"d7e8906f-73a7-4580-81c8-ec81439faea5","Type":"ContainerStarted","Data":"058f6eb2f4f5195120a96ef6b4691590bd77114e18b6ed23148cdd3864b329b4"}
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.238072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c73-account-create-update-nz44z" event={"ID":"d7e8906f-73a7-4580-81c8-ec81439faea5","Type":"ContainerStarted","Data":"a51cca167732ea4bcca0a9ad60e145a7ef6c1318026eaf99325988eb86cccb02"}
Jan 27 16:33:29 crc kubenswrapper[4772]: I0127 16:33:29.265961 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-0c73-account-create-update-nz44z" podStartSLOduration=1.265943476 podStartE2EDuration="1.265943476s" podCreationTimestamp="2026-01-27 16:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:29.259629516 +0000 UTC m=+5195.240238614" watchObservedRunningTime="2026-01-27 16:33:29.265943476 +0000 UTC m=+5195.246552564"
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.250532 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7e8906f-73a7-4580-81c8-ec81439faea5" containerID="058f6eb2f4f5195120a96ef6b4691590bd77114e18b6ed23148cdd3864b329b4" exitCode=0
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.250679 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c73-account-create-update-nz44z" event={"ID":"d7e8906f-73a7-4580-81c8-ec81439faea5","Type":"ContainerDied","Data":"058f6eb2f4f5195120a96ef6b4691590bd77114e18b6ed23148cdd3864b329b4"}
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.574617 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.663741 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b"
Jan 27 16:33:30 crc kubenswrapper[4772]: E0127 16:33:30.664041 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b"
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.688851 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts\") pod \"5292a043-9ee5-4d14-a991-c50dbf4d136e\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") "
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.689056 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b8sp\" (UniqueName: \"kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp\") pod \"5292a043-9ee5-4d14-a991-c50dbf4d136e\" (UID: \"5292a043-9ee5-4d14-a991-c50dbf4d136e\") "
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.689696 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5292a043-9ee5-4d14-a991-c50dbf4d136e" (UID: "5292a043-9ee5-4d14-a991-c50dbf4d136e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.695931 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp" (OuterVolumeSpecName: "kube-api-access-6b8sp") pod "5292a043-9ee5-4d14-a991-c50dbf4d136e" (UID: "5292a043-9ee5-4d14-a991-c50dbf4d136e"). InnerVolumeSpecName "kube-api-access-6b8sp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.791055 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5292a043-9ee5-4d14-a991-c50dbf4d136e-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:30 crc kubenswrapper[4772]: I0127 16:33:30.791088 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b8sp\" (UniqueName: \"kubernetes.io/projected/5292a043-9ee5-4d14-a991-c50dbf4d136e-kube-api-access-6b8sp\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.260629 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-jb5ch"
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.260629 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-jb5ch" event={"ID":"5292a043-9ee5-4d14-a991-c50dbf4d136e","Type":"ContainerDied","Data":"723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0"}
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.261086 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="723120bd2af0be8b0958f15b4fca165d3b69967a008c9119ea9f2ca64aa3dad0"
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.623151 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.705101 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts\") pod \"d7e8906f-73a7-4580-81c8-ec81439faea5\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") "
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.705212 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khz8d\" (UniqueName: \"kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d\") pod \"d7e8906f-73a7-4580-81c8-ec81439faea5\" (UID: \"d7e8906f-73a7-4580-81c8-ec81439faea5\") "
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.705837 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7e8906f-73a7-4580-81c8-ec81439faea5" (UID: "d7e8906f-73a7-4580-81c8-ec81439faea5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.709410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d" (OuterVolumeSpecName: "kube-api-access-khz8d") pod "d7e8906f-73a7-4580-81c8-ec81439faea5" (UID: "d7e8906f-73a7-4580-81c8-ec81439faea5"). InnerVolumeSpecName "kube-api-access-khz8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.807279 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7e8906f-73a7-4580-81c8-ec81439faea5-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:31 crc kubenswrapper[4772]: I0127 16:33:31.807323 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khz8d\" (UniqueName: \"kubernetes.io/projected/d7e8906f-73a7-4580-81c8-ec81439faea5-kube-api-access-khz8d\") on node \"crc\" DevicePath \"\""
Jan 27 16:33:32 crc kubenswrapper[4772]: I0127 16:33:32.270097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0c73-account-create-update-nz44z" event={"ID":"d7e8906f-73a7-4580-81c8-ec81439faea5","Type":"ContainerDied","Data":"a51cca167732ea4bcca0a9ad60e145a7ef6c1318026eaf99325988eb86cccb02"}
Jan 27 16:33:32 crc kubenswrapper[4772]: I0127 16:33:32.270353 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51cca167732ea4bcca0a9ad60e145a7ef6c1318026eaf99325988eb86cccb02"
Jan 27 16:33:32 crc kubenswrapper[4772]: I0127 16:33:32.270153 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0c73-account-create-update-nz44z"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.431298 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.689922 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ls87v"]
Jan 27 16:33:33 crc kubenswrapper[4772]: E0127 16:33:33.690300 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e8906f-73a7-4580-81c8-ec81439faea5" containerName="mariadb-account-create-update"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.690321 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e8906f-73a7-4580-81c8-ec81439faea5" containerName="mariadb-account-create-update"
Jan 27 16:33:33 crc kubenswrapper[4772]: E0127 16:33:33.690345 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5292a043-9ee5-4d14-a991-c50dbf4d136e" containerName="mariadb-database-create"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.690353 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5292a043-9ee5-4d14-a991-c50dbf4d136e" containerName="mariadb-database-create"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.690522 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5292a043-9ee5-4d14-a991-c50dbf4d136e" containerName="mariadb-database-create"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.690545 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e8906f-73a7-4580-81c8-ec81439faea5" containerName="mariadb-account-create-update"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.691053 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.693511 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.697752 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p8kvv"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.697799 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.698003 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.704317 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ls87v"]
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.838550 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.838910 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.838978 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rmv\" (UniqueName: \"kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.940914 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.941065 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.941093 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rmv\" (UniqueName: \"kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.946853 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: I0127 16:33:33.947819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v"
Jan 27 16:33:33 crc kubenswrapper[4772]: 
I0127 16:33:33.987836 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rmv\" (UniqueName: \"kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv\") pod \"keystone-db-sync-ls87v\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " pod="openstack/keystone-db-sync-ls87v" Jan 27 16:33:34 crc kubenswrapper[4772]: I0127 16:33:34.011591 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ls87v" Jan 27 16:33:34 crc kubenswrapper[4772]: I0127 16:33:34.530976 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ls87v"] Jan 27 16:33:34 crc kubenswrapper[4772]: W0127 16:33:34.533869 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e179bb8_f9e4_434d_9636_84cc97d632fb.slice/crio-6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4 WatchSource:0}: Error finding container 6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4: Status 404 returned error can't find the container with id 6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4 Jan 27 16:33:35 crc kubenswrapper[4772]: I0127 16:33:35.293070 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ls87v" event={"ID":"9e179bb8-f9e4-434d-9636-84cc97d632fb","Type":"ContainerStarted","Data":"5344c5595e4c241d1f3af5be472eb435a5a87e2e74a29b46f8cee0d5b6b1c135"} Jan 27 16:33:35 crc kubenswrapper[4772]: I0127 16:33:35.293433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ls87v" event={"ID":"9e179bb8-f9e4-434d-9636-84cc97d632fb","Type":"ContainerStarted","Data":"6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4"} Jan 27 16:33:35 crc kubenswrapper[4772]: I0127 16:33:35.312802 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-ls87v" podStartSLOduration=2.312782245 podStartE2EDuration="2.312782245s" podCreationTimestamp="2026-01-27 16:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:35.307880705 +0000 UTC m=+5201.288489823" watchObservedRunningTime="2026-01-27 16:33:35.312782245 +0000 UTC m=+5201.293391343" Jan 27 16:33:36 crc kubenswrapper[4772]: I0127 16:33:36.300580 4772 generic.go:334] "Generic (PLEG): container finished" podID="9e179bb8-f9e4-434d-9636-84cc97d632fb" containerID="5344c5595e4c241d1f3af5be472eb435a5a87e2e74a29b46f8cee0d5b6b1c135" exitCode=0 Jan 27 16:33:36 crc kubenswrapper[4772]: I0127 16:33:36.300627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ls87v" event={"ID":"9e179bb8-f9e4-434d-9636-84cc97d632fb","Type":"ContainerDied","Data":"5344c5595e4c241d1f3af5be472eb435a5a87e2e74a29b46f8cee0d5b6b1c135"} Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.668025 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ls87v" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.799016 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle\") pod \"9e179bb8-f9e4-434d-9636-84cc97d632fb\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.799215 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data\") pod \"9e179bb8-f9e4-434d-9636-84cc97d632fb\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.799260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24rmv\" (UniqueName: \"kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv\") pod \"9e179bb8-f9e4-434d-9636-84cc97d632fb\" (UID: \"9e179bb8-f9e4-434d-9636-84cc97d632fb\") " Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.804694 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv" (OuterVolumeSpecName: "kube-api-access-24rmv") pod "9e179bb8-f9e4-434d-9636-84cc97d632fb" (UID: "9e179bb8-f9e4-434d-9636-84cc97d632fb"). InnerVolumeSpecName "kube-api-access-24rmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.822321 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e179bb8-f9e4-434d-9636-84cc97d632fb" (UID: "9e179bb8-f9e4-434d-9636-84cc97d632fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.865784 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data" (OuterVolumeSpecName: "config-data") pod "9e179bb8-f9e4-434d-9636-84cc97d632fb" (UID: "9e179bb8-f9e4-434d-9636-84cc97d632fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.901066 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.901120 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24rmv\" (UniqueName: \"kubernetes.io/projected/9e179bb8-f9e4-434d-9636-84cc97d632fb-kube-api-access-24rmv\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.901141 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e179bb8-f9e4-434d-9636-84cc97d632fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.984390 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:33:37 crc kubenswrapper[4772]: E0127 16:33:37.984842 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e179bb8-f9e4-434d-9636-84cc97d632fb" containerName="keystone-db-sync" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.984868 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e179bb8-f9e4-434d-9636-84cc97d632fb" containerName="keystone-db-sync" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.985061 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e179bb8-f9e4-434d-9636-84cc97d632fb" containerName="keystone-db-sync" Jan 27 16:33:37 crc kubenswrapper[4772]: I0127 16:33:37.990417 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.011590 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8d7xp"] Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.012763 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.013398 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.017938 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.029937 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8d7xp"] Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.103993 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104067 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104191 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104230 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104276 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks6jz\" (UniqueName: \"kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104306 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104333 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104498 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104594 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104674 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.104841 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqwn\" (UniqueName: \"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206273 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqwn\" (UniqueName: \"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206371 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206453 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206479 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206501 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks6jz\" (UniqueName: \"kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206591 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206628 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.206658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.209305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.209330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.209507 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.209901 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.212351 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.212472 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 
crc kubenswrapper[4772]: I0127 16:33:38.212647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.212777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.213610 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.225057 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks6jz\" (UniqueName: \"kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz\") pod \"keystone-bootstrap-8d7xp\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.225239 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqwn\" (UniqueName: \"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn\") pod \"dnsmasq-dns-55b86d75d9-f64vw\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.311931 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.318226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ls87v" event={"ID":"9e179bb8-f9e4-434d-9636-84cc97d632fb","Type":"ContainerDied","Data":"6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4"} Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.318273 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d131a6d7fdbe158c526c8d07c4849e19f1f5996c01dc8bb375bca25b73effc4" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.318335 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ls87v" Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.333905 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:38 crc kubenswrapper[4772]: W0127 16:33:38.745211 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdf73e5_d2c6_47e5_8408_3ec5cbf1d22a.slice/crio-8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104 WatchSource:0}: Error finding container 8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104: Status 404 returned error can't find the container with id 8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104 Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.746233 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8d7xp"] Jan 27 16:33:38 crc kubenswrapper[4772]: I0127 16:33:38.817229 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:33:39 crc kubenswrapper[4772]: I0127 16:33:39.328895 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" 
event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerStarted","Data":"7afe0140c22bc4a9daf1f5cd3e32f3b90e22ae38f75fd3783a129a88072a7fc4"} Jan 27 16:33:39 crc kubenswrapper[4772]: I0127 16:33:39.329263 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerStarted","Data":"f24d4586c71be1b7413cfd65197ec5183ca5b0b700f7ea419468e58b997588e5"} Jan 27 16:33:39 crc kubenswrapper[4772]: I0127 16:33:39.330802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8d7xp" event={"ID":"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a","Type":"ContainerStarted","Data":"1bbb692ab013dbd6aab71415abc05175b7c980ab72c48771e22a05f7a67573bd"} Jan 27 16:33:39 crc kubenswrapper[4772]: I0127 16:33:39.330845 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8d7xp" event={"ID":"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a","Type":"ContainerStarted","Data":"8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104"} Jan 27 16:33:40 crc kubenswrapper[4772]: I0127 16:33:40.341852 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerID="7afe0140c22bc4a9daf1f5cd3e32f3b90e22ae38f75fd3783a129a88072a7fc4" exitCode=0 Jan 27 16:33:40 crc kubenswrapper[4772]: I0127 16:33:40.343591 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerDied","Data":"7afe0140c22bc4a9daf1f5cd3e32f3b90e22ae38f75fd3783a129a88072a7fc4"} Jan 27 16:33:40 crc kubenswrapper[4772]: I0127 16:33:40.390929 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8d7xp" podStartSLOduration=3.39090932 podStartE2EDuration="3.39090932s" podCreationTimestamp="2026-01-27 16:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:40.389465479 +0000 UTC m=+5206.370074577" watchObservedRunningTime="2026-01-27 16:33:40.39090932 +0000 UTC m=+5206.371518418" Jan 27 16:33:41 crc kubenswrapper[4772]: I0127 16:33:41.360002 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerStarted","Data":"e2902da210e724572224515decc30f5a1e9fa786701c1868e86464783fea99a9"} Jan 27 16:33:41 crc kubenswrapper[4772]: I0127 16:33:41.360501 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:42 crc kubenswrapper[4772]: I0127 16:33:42.662566 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:33:42 crc kubenswrapper[4772]: E0127 16:33:42.663223 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:33:44 crc kubenswrapper[4772]: I0127 16:33:44.384666 4772 generic.go:334] "Generic (PLEG): container finished" podID="abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" containerID="1bbb692ab013dbd6aab71415abc05175b7c980ab72c48771e22a05f7a67573bd" exitCode=0 Jan 27 16:33:44 crc kubenswrapper[4772]: I0127 16:33:44.384717 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8d7xp" event={"ID":"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a","Type":"ContainerDied","Data":"1bbb692ab013dbd6aab71415abc05175b7c980ab72c48771e22a05f7a67573bd"} Jan 27 16:33:44 crc kubenswrapper[4772]: I0127 
16:33:44.407246 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" podStartSLOduration=7.407226391 podStartE2EDuration="7.407226391s" podCreationTimestamp="2026-01-27 16:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:41.386312343 +0000 UTC m=+5207.366921441" watchObservedRunningTime="2026-01-27 16:33:44.407226391 +0000 UTC m=+5210.387835509" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.792056 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.837712 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.837827 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.837897 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6jz\" (UniqueName: \"kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.837962 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.837989 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.838013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle\") pod \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\" (UID: \"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a\") " Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.843326 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz" (OuterVolumeSpecName: "kube-api-access-ks6jz") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "kube-api-access-ks6jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.844790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.847382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts" (OuterVolumeSpecName: "scripts") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.857689 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.877685 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.880636 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data" (OuterVolumeSpecName: "config-data") pod "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" (UID: "abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948418 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948462 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948475 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks6jz\" (UniqueName: \"kubernetes.io/projected/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-kube-api-access-ks6jz\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948485 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948494 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:45 crc kubenswrapper[4772]: I0127 16:33:45.948502 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.414014 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8d7xp" event={"ID":"abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a","Type":"ContainerDied","Data":"8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104"} Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 
16:33:46.414454 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8defd0de00852d313c592668c916eb4da5093d21c31e3dc3769c1148abb0d104" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.414258 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8d7xp" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.505705 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8d7xp"] Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.510673 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8d7xp"] Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.568337 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8lxch"] Jan 27 16:33:46 crc kubenswrapper[4772]: E0127 16:33:46.568671 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" containerName="keystone-bootstrap" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.568693 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" containerName="keystone-bootstrap" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.568850 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" containerName="keystone-bootstrap" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.569394 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.573710 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.574655 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p8kvv" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.574850 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.575250 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.576583 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.580886 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lxch"] Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.661815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.661883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.661933 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-575p4\" (UniqueName: \"kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.661987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.662011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.662060 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.672893 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a" path="/var/lib/kubelet/pods/abdf73e5-d2c6-47e5-8408-3ec5cbf1d22a/volumes" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764122 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys\") pod \"keystone-bootstrap-8lxch\" (UID: 
\"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764257 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764342 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-575p4\" (UniqueName: \"kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764384 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764413 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.764496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 
crc kubenswrapper[4772]: I0127 16:33:46.768795 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.769733 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.769813 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.776732 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.777318 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.784383 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-575p4\" (UniqueName: 
\"kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4\") pod \"keystone-bootstrap-8lxch\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:46 crc kubenswrapper[4772]: I0127 16:33:46.899410 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:47 crc kubenswrapper[4772]: I0127 16:33:47.357118 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lxch"] Jan 27 16:33:47 crc kubenswrapper[4772]: I0127 16:33:47.427446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lxch" event={"ID":"e4b64159-de94-4b31-85ce-b845fdb391b3","Type":"ContainerStarted","Data":"ccf19bff553c261500929d96e5891dcc653cd8e08ae4a1b0c3230f64fe99f92f"} Jan 27 16:33:48 crc kubenswrapper[4772]: I0127 16:33:48.313422 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:33:48 crc kubenswrapper[4772]: I0127 16:33:48.388946 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:48 crc kubenswrapper[4772]: I0127 16:33:48.389220 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-746d786469-qv9pq" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="dnsmasq-dns" containerID="cri-o://07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691" gracePeriod=10 Jan 27 16:33:48 crc kubenswrapper[4772]: I0127 16:33:48.460693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lxch" event={"ID":"e4b64159-de94-4b31-85ce-b845fdb391b3","Type":"ContainerStarted","Data":"761ce723d19214c2204c8e256e0ee21bf82feb31ce0c9f783c4a5819a97623f7"} Jan 27 16:33:48 crc kubenswrapper[4772]: I0127 16:33:48.520764 4772 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/keystone-bootstrap-8lxch" podStartSLOduration=2.520742832 podStartE2EDuration="2.520742832s" podCreationTimestamp="2026-01-27 16:33:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:33:48.501982468 +0000 UTC m=+5214.482591576" watchObservedRunningTime="2026-01-27 16:33:48.520742832 +0000 UTC m=+5214.501351940" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.348969 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.422524 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc\") pod \"320b0d26-aa22-4854-ac28-4699f95fb37b\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.422783 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb\") pod \"320b0d26-aa22-4854-ac28-4699f95fb37b\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.422947 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config\") pod \"320b0d26-aa22-4854-ac28-4699f95fb37b\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.423086 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjfw7\" (UniqueName: \"kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7\") pod \"320b0d26-aa22-4854-ac28-4699f95fb37b\" (UID: 
\"320b0d26-aa22-4854-ac28-4699f95fb37b\") " Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.423179 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb\") pod \"320b0d26-aa22-4854-ac28-4699f95fb37b\" (UID: \"320b0d26-aa22-4854-ac28-4699f95fb37b\") " Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.431478 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7" (OuterVolumeSpecName: "kube-api-access-zjfw7") pod "320b0d26-aa22-4854-ac28-4699f95fb37b" (UID: "320b0d26-aa22-4854-ac28-4699f95fb37b"). InnerVolumeSpecName "kube-api-access-zjfw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.460532 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "320b0d26-aa22-4854-ac28-4699f95fb37b" (UID: "320b0d26-aa22-4854-ac28-4699f95fb37b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.461071 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "320b0d26-aa22-4854-ac28-4699f95fb37b" (UID: "320b0d26-aa22-4854-ac28-4699f95fb37b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.473283 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "320b0d26-aa22-4854-ac28-4699f95fb37b" (UID: "320b0d26-aa22-4854-ac28-4699f95fb37b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.473779 4772 generic.go:334] "Generic (PLEG): container finished" podID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerID="07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691" exitCode=0 Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.473836 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-746d786469-qv9pq" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.473927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746d786469-qv9pq" event={"ID":"320b0d26-aa22-4854-ac28-4699f95fb37b","Type":"ContainerDied","Data":"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691"} Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.474142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-746d786469-qv9pq" event={"ID":"320b0d26-aa22-4854-ac28-4699f95fb37b","Type":"ContainerDied","Data":"551776dfcff4ae5d83241e31aab7653a20a9814c77297414bc41ae9b8bf9cecd"} Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.474231 4772 scope.go:117] "RemoveContainer" containerID="07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.483659 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config" (OuterVolumeSpecName: "config") pod 
"320b0d26-aa22-4854-ac28-4699f95fb37b" (UID: "320b0d26-aa22-4854-ac28-4699f95fb37b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.526010 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.526050 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.526063 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.526072 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjfw7\" (UniqueName: \"kubernetes.io/projected/320b0d26-aa22-4854-ac28-4699f95fb37b-kube-api-access-zjfw7\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.526080 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/320b0d26-aa22-4854-ac28-4699f95fb37b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.543730 4772 scope.go:117] "RemoveContainer" containerID="02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.561002 4772 scope.go:117] "RemoveContainer" containerID="07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691" Jan 27 16:33:49 crc kubenswrapper[4772]: E0127 16:33:49.561476 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691\": container with ID starting with 07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691 not found: ID does not exist" containerID="07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.561508 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691"} err="failed to get container status \"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691\": rpc error: code = NotFound desc = could not find container \"07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691\": container with ID starting with 07f4aa1fb5d6e810824b11ccdad9666a53ee8ab3a0bbb958fb4483c8bf5dd691 not found: ID does not exist" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.561536 4772 scope.go:117] "RemoveContainer" containerID="02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf" Jan 27 16:33:49 crc kubenswrapper[4772]: E0127 16:33:49.561932 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf\": container with ID starting with 02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf not found: ID does not exist" containerID="02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.561974 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf"} err="failed to get container status \"02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf\": rpc error: code = NotFound desc = could not find container 
\"02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf\": container with ID starting with 02bb8fc05201bb2286e71696309cca885abf131326f1399ef983ae5f44f953bf not found: ID does not exist" Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.816768 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:49 crc kubenswrapper[4772]: I0127 16:33:49.824456 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-746d786469-qv9pq"] Jan 27 16:33:50 crc kubenswrapper[4772]: I0127 16:33:50.485212 4772 generic.go:334] "Generic (PLEG): container finished" podID="e4b64159-de94-4b31-85ce-b845fdb391b3" containerID="761ce723d19214c2204c8e256e0ee21bf82feb31ce0c9f783c4a5819a97623f7" exitCode=0 Jan 27 16:33:50 crc kubenswrapper[4772]: I0127 16:33:50.485276 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lxch" event={"ID":"e4b64159-de94-4b31-85ce-b845fdb391b3","Type":"ContainerDied","Data":"761ce723d19214c2204c8e256e0ee21bf82feb31ce0c9f783c4a5819a97623f7"} Jan 27 16:33:50 crc kubenswrapper[4772]: I0127 16:33:50.672007 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" path="/var/lib/kubelet/pods/320b0d26-aa22-4854-ac28-4699f95fb37b/volumes" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.850907 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.865569 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.865661 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-575p4\" (UniqueName: \"kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.865836 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.865904 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.865991 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.866051 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts\") pod \"e4b64159-de94-4b31-85ce-b845fdb391b3\" (UID: \"e4b64159-de94-4b31-85ce-b845fdb391b3\") " Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.874632 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4" (OuterVolumeSpecName: "kube-api-access-575p4") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "kube-api-access-575p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.875527 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts" (OuterVolumeSpecName: "scripts") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.878305 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.881028 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.901458 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.907583 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data" (OuterVolumeSpecName: "config-data") pod "e4b64159-de94-4b31-85ce-b845fdb391b3" (UID: "e4b64159-de94-4b31-85ce-b845fdb391b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.967860 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.968096 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.968224 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.968320 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-575p4\" (UniqueName: \"kubernetes.io/projected/e4b64159-de94-4b31-85ce-b845fdb391b3-kube-api-access-575p4\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:51 crc 
kubenswrapper[4772]: I0127 16:33:51.968385 4772 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:51 crc kubenswrapper[4772]: I0127 16:33:51.968438 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b64159-de94-4b31-85ce-b845fdb391b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.505384 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lxch" event={"ID":"e4b64159-de94-4b31-85ce-b845fdb391b3","Type":"ContainerDied","Data":"ccf19bff553c261500929d96e5891dcc653cd8e08ae4a1b0c3230f64fe99f92f"} Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.505428 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf19bff553c261500929d96e5891dcc653cd8e08ae4a1b0c3230f64fe99f92f" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.505435 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lxch" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.940921 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56b5f9d6fc-hmz72"] Jan 27 16:33:52 crc kubenswrapper[4772]: E0127 16:33:52.941389 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="dnsmasq-dns" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.941406 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="dnsmasq-dns" Jan 27 16:33:52 crc kubenswrapper[4772]: E0127 16:33:52.941443 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="init" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.941450 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="init" Jan 27 16:33:52 crc kubenswrapper[4772]: E0127 16:33:52.941465 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b64159-de94-4b31-85ce-b845fdb391b3" containerName="keystone-bootstrap" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.941474 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b64159-de94-4b31-85ce-b845fdb391b3" containerName="keystone-bootstrap" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.941666 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="320b0d26-aa22-4854-ac28-4699f95fb37b" containerName="dnsmasq-dns" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.941679 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b64159-de94-4b31-85ce-b845fdb391b3" containerName="keystone-bootstrap" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.942312 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.945307 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.945742 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.945910 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.947812 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p8kvv" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.952451 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56b5f9d6fc-hmz72"] Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.983488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-config-data\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.983748 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-fernet-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.983892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxmj\" (UniqueName: \"kubernetes.io/projected/f6255916-357e-42c6-b936-27151f6b2260-kube-api-access-gdxmj\") pod \"keystone-56b5f9d6fc-hmz72\" 
(UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.984006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-credential-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.984101 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-combined-ca-bundle\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:52 crc kubenswrapper[4772]: I0127 16:33:52.984262 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-scripts\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.084845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-config-data\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.084905 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-fernet-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " 
pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.084933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxmj\" (UniqueName: \"kubernetes.io/projected/f6255916-357e-42c6-b936-27151f6b2260-kube-api-access-gdxmj\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.084960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-credential-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.084979 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-combined-ca-bundle\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.085023 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-scripts\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.090270 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-scripts\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.090498 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-fernet-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.091364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-config-data\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.091754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-credential-keys\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.102239 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxmj\" (UniqueName: \"kubernetes.io/projected/f6255916-357e-42c6-b936-27151f6b2260-kube-api-access-gdxmj\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.102924 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6255916-357e-42c6-b936-27151f6b2260-combined-ca-bundle\") pod \"keystone-56b5f9d6fc-hmz72\" (UID: \"f6255916-357e-42c6-b936-27151f6b2260\") " pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.272004 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:53 crc kubenswrapper[4772]: I0127 16:33:53.744032 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56b5f9d6fc-hmz72"] Jan 27 16:33:53 crc kubenswrapper[4772]: W0127 16:33:53.752420 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6255916_357e_42c6_b936_27151f6b2260.slice/crio-be66932cfc381c827b2a9362ad58833b6a258a4e7f30abcfb528cc35363e6071 WatchSource:0}: Error finding container be66932cfc381c827b2a9362ad58833b6a258a4e7f30abcfb528cc35363e6071: Status 404 returned error can't find the container with id be66932cfc381c827b2a9362ad58833b6a258a4e7f30abcfb528cc35363e6071 Jan 27 16:33:54 crc kubenswrapper[4772]: I0127 16:33:54.521401 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b5f9d6fc-hmz72" event={"ID":"f6255916-357e-42c6-b936-27151f6b2260","Type":"ContainerStarted","Data":"b37f06a0442dbec54e0f8ffb189676e3082888ea33e8970849fb080c590e5548"} Jan 27 16:33:54 crc kubenswrapper[4772]: I0127 16:33:54.521447 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56b5f9d6fc-hmz72" event={"ID":"f6255916-357e-42c6-b936-27151f6b2260","Type":"ContainerStarted","Data":"be66932cfc381c827b2a9362ad58833b6a258a4e7f30abcfb528cc35363e6071"} Jan 27 16:33:54 crc kubenswrapper[4772]: I0127 16:33:54.521477 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:33:54 crc kubenswrapper[4772]: I0127 16:33:54.541351 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56b5f9d6fc-hmz72" podStartSLOduration=2.541328002 podStartE2EDuration="2.541328002s" podCreationTimestamp="2026-01-27 16:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 16:33:54.536142704 +0000 UTC m=+5220.516751812" watchObservedRunningTime="2026-01-27 16:33:54.541328002 +0000 UTC m=+5220.521937100" Jan 27 16:33:54 crc kubenswrapper[4772]: I0127 16:33:54.672154 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:33:54 crc kubenswrapper[4772]: E0127 16:33:54.672379 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:34:06 crc kubenswrapper[4772]: I0127 16:34:06.663668 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:34:06 crc kubenswrapper[4772]: E0127 16:34:06.664403 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:34:18 crc kubenswrapper[4772]: I0127 16:34:18.663221 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:34:18 crc kubenswrapper[4772]: E0127 16:34:18.664520 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:34:24 crc kubenswrapper[4772]: I0127 16:34:24.714706 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56b5f9d6fc-hmz72" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.090118 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.091793 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.094233 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-jj9xz" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.095155 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.095915 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.100719 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.177415 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv64z\" (UniqueName: \"kubernetes.io/projected/e05a90a8-dbbb-4e24-ac89-f30360482af9-kube-api-access-sv64z\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.177528 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config-secret\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.177607 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.280265 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config-secret\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.280426 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.280550 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv64z\" (UniqueName: \"kubernetes.io/projected/e05a90a8-dbbb-4e24-ac89-f30360482af9-kube-api-access-sv64z\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.281460 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config\") pod \"openstackclient\" (UID: 
\"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.286625 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e05a90a8-dbbb-4e24-ac89-f30360482af9-openstack-config-secret\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.315367 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv64z\" (UniqueName: \"kubernetes.io/projected/e05a90a8-dbbb-4e24-ac89-f30360482af9-kube-api-access-sv64z\") pod \"openstackclient\" (UID: \"e05a90a8-dbbb-4e24-ac89-f30360482af9\") " pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.428621 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 16:34:28 crc kubenswrapper[4772]: I0127 16:34:28.881468 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 16:34:29 crc kubenswrapper[4772]: I0127 16:34:29.802478 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e05a90a8-dbbb-4e24-ac89-f30360482af9","Type":"ContainerStarted","Data":"9b53bc54d8825e655eba77337106d2bf01a57ffe64896ec01ce03446aa5bc757"} Jan 27 16:34:29 crc kubenswrapper[4772]: I0127 16:34:29.802809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e05a90a8-dbbb-4e24-ac89-f30360482af9","Type":"ContainerStarted","Data":"0caff9adca834db20dd1547787789d57fb2e7aea65070c5b0255817699f188e8"} Jan 27 16:34:29 crc kubenswrapper[4772]: I0127 16:34:29.818489 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.818468179 podStartE2EDuration="1.818468179s" 
podCreationTimestamp="2026-01-27 16:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:34:29.815914696 +0000 UTC m=+5255.796523794" watchObservedRunningTime="2026-01-27 16:34:29.818468179 +0000 UTC m=+5255.799077277" Jan 27 16:34:31 crc kubenswrapper[4772]: I0127 16:34:31.663043 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:34:31 crc kubenswrapper[4772]: E0127 16:34:31.663829 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:34:45 crc kubenswrapper[4772]: I0127 16:34:45.663444 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:34:46 crc kubenswrapper[4772]: I0127 16:34:46.948928 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9"} Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.145054 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rxmk9"] Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.146459 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.154862 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2bff-account-create-update-nq7qs"] Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.156136 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.158751 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.166640 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rxmk9"] Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.175460 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2bff-account-create-update-nq7qs"] Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.223545 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lpdq\" (UniqueName: \"kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.223597 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.223691 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfkt4\" (UniqueName: 
\"kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.223757 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.325247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkt4\" (UniqueName: \"kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.325317 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.325378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lpdq\" (UniqueName: \"kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.325400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.326126 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.326324 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.347483 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lpdq\" (UniqueName: \"kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq\") pod \"barbican-db-create-rxmk9\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.348158 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfkt4\" (UniqueName: \"kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4\") pod \"barbican-2bff-account-create-update-nq7qs\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.471906 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.486306 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:05 crc kubenswrapper[4772]: I0127 16:36:05.999206 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rxmk9"] Jan 27 16:36:06 crc kubenswrapper[4772]: W0127 16:36:06.002415 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe4269c_3ff7_49b4_82c3_1a419f676f89.slice/crio-99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22 WatchSource:0}: Error finding container 99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22: Status 404 returned error can't find the container with id 99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22 Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.024572 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2bff-account-create-update-nq7qs"] Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.782041 4772 generic.go:334] "Generic (PLEG): container finished" podID="1fe4269c-3ff7-49b4-82c3-1a419f676f89" containerID="5a8e2d8cff2b48fb4540fd3d3c996a83107f6a91a01287bf66fdb29b10e52c6b" exitCode=0 Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.782111 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rxmk9" event={"ID":"1fe4269c-3ff7-49b4-82c3-1a419f676f89","Type":"ContainerDied","Data":"5a8e2d8cff2b48fb4540fd3d3c996a83107f6a91a01287bf66fdb29b10e52c6b"} Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.782137 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rxmk9" 
event={"ID":"1fe4269c-3ff7-49b4-82c3-1a419f676f89","Type":"ContainerStarted","Data":"99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22"} Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.783658 4772 generic.go:334] "Generic (PLEG): container finished" podID="d02ab9dd-317c-4787-aa94-0ad8dff15380" containerID="39e5a537d87039cbc6538b5672d3a7484ab8d88c163808f5672ea54c55115098" exitCode=0 Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.783700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2bff-account-create-update-nq7qs" event={"ID":"d02ab9dd-317c-4787-aa94-0ad8dff15380","Type":"ContainerDied","Data":"39e5a537d87039cbc6538b5672d3a7484ab8d88c163808f5672ea54c55115098"} Jan 27 16:36:06 crc kubenswrapper[4772]: I0127 16:36:06.783729 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2bff-account-create-update-nq7qs" event={"ID":"d02ab9dd-317c-4787-aa94-0ad8dff15380","Type":"ContainerStarted","Data":"cde59099e96d076103d0ab96b2715ae4a771641a2db8b31a3e4cb2c31ede5713"} Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.355879 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.363702 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.409314 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lpdq\" (UniqueName: \"kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq\") pod \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.409406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts\") pod \"d02ab9dd-317c-4787-aa94-0ad8dff15380\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.409466 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfkt4\" (UniqueName: \"kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4\") pod \"d02ab9dd-317c-4787-aa94-0ad8dff15380\" (UID: \"d02ab9dd-317c-4787-aa94-0ad8dff15380\") " Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.409510 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts\") pod \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\" (UID: \"1fe4269c-3ff7-49b4-82c3-1a419f676f89\") " Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.410281 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fe4269c-3ff7-49b4-82c3-1a419f676f89" (UID: "1fe4269c-3ff7-49b4-82c3-1a419f676f89"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.410431 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d02ab9dd-317c-4787-aa94-0ad8dff15380" (UID: "d02ab9dd-317c-4787-aa94-0ad8dff15380"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.417802 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4" (OuterVolumeSpecName: "kube-api-access-hfkt4") pod "d02ab9dd-317c-4787-aa94-0ad8dff15380" (UID: "d02ab9dd-317c-4787-aa94-0ad8dff15380"). InnerVolumeSpecName "kube-api-access-hfkt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.425689 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq" (OuterVolumeSpecName: "kube-api-access-2lpdq") pod "1fe4269c-3ff7-49b4-82c3-1a419f676f89" (UID: "1fe4269c-3ff7-49b4-82c3-1a419f676f89"). InnerVolumeSpecName "kube-api-access-2lpdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.510999 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfkt4\" (UniqueName: \"kubernetes.io/projected/d02ab9dd-317c-4787-aa94-0ad8dff15380-kube-api-access-hfkt4\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.511046 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fe4269c-3ff7-49b4-82c3-1a419f676f89-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.511060 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lpdq\" (UniqueName: \"kubernetes.io/projected/1fe4269c-3ff7-49b4-82c3-1a419f676f89-kube-api-access-2lpdq\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.511071 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d02ab9dd-317c-4787-aa94-0ad8dff15380-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.798548 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rxmk9" event={"ID":"1fe4269c-3ff7-49b4-82c3-1a419f676f89","Type":"ContainerDied","Data":"99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22"} Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.798588 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99421eccfe268ad945cb2a25bc785ee2156c9e63300fe417ab74e3ba4e349d22" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.798595 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rxmk9" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.800362 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2bff-account-create-update-nq7qs" event={"ID":"d02ab9dd-317c-4787-aa94-0ad8dff15380","Type":"ContainerDied","Data":"cde59099e96d076103d0ab96b2715ae4a771641a2db8b31a3e4cb2c31ede5713"} Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.800389 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cde59099e96d076103d0ab96b2715ae4a771641a2db8b31a3e4cb2c31ede5713" Jan 27 16:36:08 crc kubenswrapper[4772]: I0127 16:36:08.800418 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2bff-account-create-update-nq7qs" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.554962 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2jctn"] Jan 27 16:36:10 crc kubenswrapper[4772]: E0127 16:36:10.555564 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4269c-3ff7-49b4-82c3-1a419f676f89" containerName="mariadb-database-create" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.555577 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4269c-3ff7-49b4-82c3-1a419f676f89" containerName="mariadb-database-create" Jan 27 16:36:10 crc kubenswrapper[4772]: E0127 16:36:10.555587 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02ab9dd-317c-4787-aa94-0ad8dff15380" containerName="mariadb-account-create-update" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.555594 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02ab9dd-317c-4787-aa94-0ad8dff15380" containerName="mariadb-account-create-update" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.555790 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02ab9dd-317c-4787-aa94-0ad8dff15380" 
containerName="mariadb-account-create-update" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.555813 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4269c-3ff7-49b4-82c3-1a419f676f89" containerName="mariadb-database-create" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.556364 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.558113 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-d4g2g" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.563000 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.571458 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2jctn"] Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.745885 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.745927 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55sq\" (UniqueName: \"kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.746907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.847741 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.847832 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.847848 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55sq\" (UniqueName: \"kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.853446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.854699 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data\") pod 
\"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:10 crc kubenswrapper[4772]: I0127 16:36:10.875475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55sq\" (UniqueName: \"kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq\") pod \"barbican-db-sync-2jctn\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:11 crc kubenswrapper[4772]: I0127 16:36:11.171110 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:11 crc kubenswrapper[4772]: I0127 16:36:11.601655 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2jctn"] Jan 27 16:36:11 crc kubenswrapper[4772]: I0127 16:36:11.824705 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2jctn" event={"ID":"370ff587-7186-4f81-83a2-886a15900229","Type":"ContainerStarted","Data":"ccfecf4ef23c50d400b8ca7553bbf1eff4d9d62e8caff28f5a3562a7630cf393"} Jan 27 16:36:11 crc kubenswrapper[4772]: I0127 16:36:11.825053 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2jctn" event={"ID":"370ff587-7186-4f81-83a2-886a15900229","Type":"ContainerStarted","Data":"66fb01f7e2b25b680204b44292b0be570804c81d694aa738565de3c8455c3ac0"} Jan 27 16:36:11 crc kubenswrapper[4772]: I0127 16:36:11.842066 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2jctn" podStartSLOduration=1.8420488449999999 podStartE2EDuration="1.842048845s" podCreationTimestamp="2026-01-27 16:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:11.837809214 +0000 UTC m=+5357.818418322" watchObservedRunningTime="2026-01-27 16:36:11.842048845 
+0000 UTC m=+5357.822657943" Jan 27 16:36:13 crc kubenswrapper[4772]: I0127 16:36:13.841113 4772 generic.go:334] "Generic (PLEG): container finished" podID="370ff587-7186-4f81-83a2-886a15900229" containerID="ccfecf4ef23c50d400b8ca7553bbf1eff4d9d62e8caff28f5a3562a7630cf393" exitCode=0 Jan 27 16:36:13 crc kubenswrapper[4772]: I0127 16:36:13.841211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2jctn" event={"ID":"370ff587-7186-4f81-83a2-886a15900229","Type":"ContainerDied","Data":"ccfecf4ef23c50d400b8ca7553bbf1eff4d9d62e8caff28f5a3562a7630cf393"} Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.149901 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.327759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle\") pod \"370ff587-7186-4f81-83a2-886a15900229\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.327992 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z55sq\" (UniqueName: \"kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq\") pod \"370ff587-7186-4f81-83a2-886a15900229\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.328021 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data\") pod \"370ff587-7186-4f81-83a2-886a15900229\" (UID: \"370ff587-7186-4f81-83a2-886a15900229\") " Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.333353 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "370ff587-7186-4f81-83a2-886a15900229" (UID: "370ff587-7186-4f81-83a2-886a15900229"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.333683 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq" (OuterVolumeSpecName: "kube-api-access-z55sq") pod "370ff587-7186-4f81-83a2-886a15900229" (UID: "370ff587-7186-4f81-83a2-886a15900229"). InnerVolumeSpecName "kube-api-access-z55sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.349050 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "370ff587-7186-4f81-83a2-886a15900229" (UID: "370ff587-7186-4f81-83a2-886a15900229"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.430959 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z55sq\" (UniqueName: \"kubernetes.io/projected/370ff587-7186-4f81-83a2-886a15900229-kube-api-access-z55sq\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.430999 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.431013 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/370ff587-7186-4f81-83a2-886a15900229-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.858400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2jctn" event={"ID":"370ff587-7186-4f81-83a2-886a15900229","Type":"ContainerDied","Data":"66fb01f7e2b25b680204b44292b0be570804c81d694aa738565de3c8455c3ac0"} Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.858439 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fb01f7e2b25b680204b44292b0be570804c81d694aa738565de3c8455c3ac0" Jan 27 16:36:15 crc kubenswrapper[4772]: I0127 16:36:15.858476 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-2jctn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.088300 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69b64c5dd7-dj9pw"] Jan 27 16:36:16 crc kubenswrapper[4772]: E0127 16:36:16.088984 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="370ff587-7186-4f81-83a2-886a15900229" containerName="barbican-db-sync" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.089004 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="370ff587-7186-4f81-83a2-886a15900229" containerName="barbican-db-sync" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.089280 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="370ff587-7186-4f81-83a2-886a15900229" containerName="barbican-db-sync" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.090371 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.093449 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.093685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-d4g2g" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.093830 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.116071 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b64c5dd7-dj9pw"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.136122 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75df4b6d74-xpp9t"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.138279 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.142726 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.152685 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75df4b6d74-xpp9t"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.219890 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.231113 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.246200 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.250846 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-combined-ca-bundle\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.250910 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vx56\" (UniqueName: \"kubernetes.io/projected/0d49f4dc-fd69-4e43-9866-87af6da31197-kube-api-access-6vx56\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.250955 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.250984 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da34b58-6b43-4e25-bdec-39985c344819-logs\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.251007 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.251087 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data-custom\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.251119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmbpb\" (UniqueName: \"kubernetes.io/projected/2da34b58-6b43-4e25-bdec-39985c344819-kube-api-access-nmbpb\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: 
I0127 16:36:16.251145 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data-custom\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.251222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-combined-ca-bundle\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.251245 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d49f4dc-fd69-4e43-9866-87af6da31197-logs\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.315565 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dcb7f9846-lrk6t"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.319036 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.325603 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.341621 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcb7f9846-lrk6t"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352673 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-combined-ca-bundle\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d49f4dc-fd69-4e43-9866-87af6da31197-logs\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-combined-ca-bundle\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vx56\" (UniqueName: \"kubernetes.io/projected/0d49f4dc-fd69-4e43-9866-87af6da31197-kube-api-access-6vx56\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " 
pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352836 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352863 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352883 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352907 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352928 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da34b58-6b43-4e25-bdec-39985c344819-logs\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " 
pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.352989 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwtxg\" (UniqueName: \"kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.353029 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data-custom\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.353062 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmbpb\" (UniqueName: \"kubernetes.io/projected/2da34b58-6b43-4e25-bdec-39985c344819-kube-api-access-nmbpb\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.353090 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data-custom\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.353123 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.353146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.355435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d49f4dc-fd69-4e43-9866-87af6da31197-logs\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.356955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2da34b58-6b43-4e25-bdec-39985c344819-logs\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.357121 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-combined-ca-bundle\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.358157 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.359433 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-combined-ca-bundle\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.371940 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data-custom\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.373189 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2da34b58-6b43-4e25-bdec-39985c344819-config-data\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.376266 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d49f4dc-fd69-4e43-9866-87af6da31197-config-data-custom\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.377679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx56\" (UniqueName: 
\"kubernetes.io/projected/0d49f4dc-fd69-4e43-9866-87af6da31197-kube-api-access-6vx56\") pod \"barbican-worker-69b64c5dd7-dj9pw\" (UID: \"0d49f4dc-fd69-4e43-9866-87af6da31197\") " pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.382893 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmbpb\" (UniqueName: \"kubernetes.io/projected/2da34b58-6b43-4e25-bdec-39985c344819-kube-api-access-nmbpb\") pod \"barbican-keystone-listener-75df4b6d74-xpp9t\" (UID: \"2da34b58-6b43-4e25-bdec-39985c344819\") " pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.414996 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.455371 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-logs\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.455424 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-combined-ca-bundle\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.455460 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " 
pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.455800 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwtxg\" (UniqueName: \"kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.456017 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.456428 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.456926 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.456048 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc 
kubenswrapper[4772]: I0127 16:36:16.456981 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.457023 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z7sl\" (UniqueName: \"kubernetes.io/projected/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-kube-api-access-9z7sl\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.457109 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.457196 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data-custom\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.457957 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc 
kubenswrapper[4772]: I0127 16:36:16.458008 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.478598 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwtxg\" (UniqueName: \"kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg\") pod \"dnsmasq-dns-55d7c89dc7-g6rsn\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.504667 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.559857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z7sl\" (UniqueName: \"kubernetes.io/projected/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-kube-api-access-9z7sl\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.560252 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.560276 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data-custom\") pod 
\"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.560323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-logs\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.560343 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-combined-ca-bundle\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.561797 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-logs\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.562263 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.567467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.577131 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-config-data-custom\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.579614 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-combined-ca-bundle\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.584373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z7sl\" (UniqueName: \"kubernetes.io/projected/5c77e7c3-5320-4fa6-810d-bc819a6f7b03-kube-api-access-9z7sl\") pod \"barbican-api-7dcb7f9846-lrk6t\" (UID: \"5c77e7c3-5320-4fa6-810d-bc819a6f7b03\") " pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.644317 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.729587 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69b64c5dd7-dj9pw"] Jan 27 16:36:16 crc kubenswrapper[4772]: I0127 16:36:16.880343 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" event={"ID":"0d49f4dc-fd69-4e43-9866-87af6da31197","Type":"ContainerStarted","Data":"13e2bafea465246f523200b9e1d2507ff9859908f73a49b661a410e4554ffc6a"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.081819 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75df4b6d74-xpp9t"] Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.112888 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dcb7f9846-lrk6t"] Jan 27 16:36:17 crc kubenswrapper[4772]: W0127 16:36:17.209950 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c77e7c3_5320_4fa6_810d_bc819a6f7b03.slice/crio-74a6a9add23994a7881d8a2122dcd2a1fba5bffb8857d7dcbb30dfad5a2c6e1c WatchSource:0}: Error finding container 74a6a9add23994a7881d8a2122dcd2a1fba5bffb8857d7dcbb30dfad5a2c6e1c: Status 404 returned error can't find the container with id 74a6a9add23994a7881d8a2122dcd2a1fba5bffb8857d7dcbb30dfad5a2c6e1c Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.220562 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.891212 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcb7f9846-lrk6t" event={"ID":"5c77e7c3-5320-4fa6-810d-bc819a6f7b03","Type":"ContainerStarted","Data":"f12fb3f9c0d962fbef42c301cc8095cd2f46733470a2bdb51e65accd5f2cd4b5"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.891495 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcb7f9846-lrk6t" event={"ID":"5c77e7c3-5320-4fa6-810d-bc819a6f7b03","Type":"ContainerStarted","Data":"9eca56474104926e985ccce0312b20cf97a4756f8d45e934237a323350012d2a"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.891513 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dcb7f9846-lrk6t" event={"ID":"5c77e7c3-5320-4fa6-810d-bc819a6f7b03","Type":"ContainerStarted","Data":"74a6a9add23994a7881d8a2122dcd2a1fba5bffb8857d7dcbb30dfad5a2c6e1c"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.891530 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.892818 4772 generic.go:334] "Generic (PLEG): container finished" podID="53c70df7-cdca-4296-af71-b5b002484575" containerID="d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba" exitCode=0 Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.892910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" event={"ID":"53c70df7-cdca-4296-af71-b5b002484575","Type":"ContainerDied","Data":"d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.892954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" event={"ID":"53c70df7-cdca-4296-af71-b5b002484575","Type":"ContainerStarted","Data":"ab1dfeb7c7875a02f8dfd72be8da51340b47ef61cc5646a975b8f955d01b7ae0"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.895441 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" event={"ID":"0d49f4dc-fd69-4e43-9866-87af6da31197","Type":"ContainerStarted","Data":"fe239a44b812bcf92e470bb7c00a935448d74f7c04681a2145ead95d654c9d60"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 
16:36:17.895468 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" event={"ID":"0d49f4dc-fd69-4e43-9866-87af6da31197","Type":"ContainerStarted","Data":"c2da4b92862affafff9c7df896dc126644dbea4b5caddda1b9369b24e9864bde"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.899145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" event={"ID":"2da34b58-6b43-4e25-bdec-39985c344819","Type":"ContainerStarted","Data":"715392760b58db6867b76332ca5ce68acb4f57a47c4a2d79c281bad50a940b93"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.899218 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" event={"ID":"2da34b58-6b43-4e25-bdec-39985c344819","Type":"ContainerStarted","Data":"50d65912201083457d8e9653e4542690d4ae602716af39ac11b4a87b0d075aae"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.899237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" event={"ID":"2da34b58-6b43-4e25-bdec-39985c344819","Type":"ContainerStarted","Data":"906d0db148f7dd6d703853a32e42bfc3bd01bb165a51a1d14e445ba30a5155fa"} Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.918102 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dcb7f9846-lrk6t" podStartSLOduration=1.918085255 podStartE2EDuration="1.918085255s" podCreationTimestamp="2026-01-27 16:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:17.912415223 +0000 UTC m=+5363.893024331" watchObservedRunningTime="2026-01-27 16:36:17.918085255 +0000 UTC m=+5363.898694353" Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.945995 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-keystone-listener-75df4b6d74-xpp9t" podStartSLOduration=1.945953439 podStartE2EDuration="1.945953439s" podCreationTimestamp="2026-01-27 16:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:17.942360967 +0000 UTC m=+5363.922970065" watchObservedRunningTime="2026-01-27 16:36:17.945953439 +0000 UTC m=+5363.926562537" Jan 27 16:36:17 crc kubenswrapper[4772]: I0127 16:36:17.993038 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69b64c5dd7-dj9pw" podStartSLOduration=1.993021261 podStartE2EDuration="1.993021261s" podCreationTimestamp="2026-01-27 16:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:17.990302683 +0000 UTC m=+5363.970911781" watchObservedRunningTime="2026-01-27 16:36:17.993021261 +0000 UTC m=+5363.973630359" Jan 27 16:36:18 crc kubenswrapper[4772]: I0127 16:36:18.910467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" event={"ID":"53c70df7-cdca-4296-af71-b5b002484575","Type":"ContainerStarted","Data":"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70"} Jan 27 16:36:18 crc kubenswrapper[4772]: I0127 16:36:18.910894 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:18 crc kubenswrapper[4772]: I0127 16:36:18.911180 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:18 crc kubenswrapper[4772]: I0127 16:36:18.942452 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" podStartSLOduration=2.942430413 podStartE2EDuration="2.942430413s" podCreationTimestamp="2026-01-27 16:36:16 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:18.934117786 +0000 UTC m=+5364.914726894" watchObservedRunningTime="2026-01-27 16:36:18.942430413 +0000 UTC m=+5364.923039501" Jan 27 16:36:26 crc kubenswrapper[4772]: I0127 16:36:26.563313 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:36:26 crc kubenswrapper[4772]: I0127 16:36:26.637545 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:36:26 crc kubenswrapper[4772]: I0127 16:36:26.639032 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="dnsmasq-dns" containerID="cri-o://e2902da210e724572224515decc30f5a1e9fa786701c1868e86464783fea99a9" gracePeriod=10 Jan 27 16:36:26 crc kubenswrapper[4772]: I0127 16:36:26.987487 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerID="e2902da210e724572224515decc30f5a1e9fa786701c1868e86464783fea99a9" exitCode=0 Jan 27 16:36:26 crc kubenswrapper[4772]: I0127 16:36:26.987532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerDied","Data":"e2902da210e724572224515decc30f5a1e9fa786701c1868e86464783fea99a9"} Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.169983 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.257713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc\") pod \"9f5334e8-ad35-41c0-b74f-d7283b625da0\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.258071 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config\") pod \"9f5334e8-ad35-41c0-b74f-d7283b625da0\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.258130 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb\") pod \"9f5334e8-ad35-41c0-b74f-d7283b625da0\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.258220 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fqwn\" (UniqueName: \"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn\") pod \"9f5334e8-ad35-41c0-b74f-d7283b625da0\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.258286 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb\") pod \"9f5334e8-ad35-41c0-b74f-d7283b625da0\" (UID: \"9f5334e8-ad35-41c0-b74f-d7283b625da0\") " Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.269499 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn" (OuterVolumeSpecName: "kube-api-access-2fqwn") pod "9f5334e8-ad35-41c0-b74f-d7283b625da0" (UID: "9f5334e8-ad35-41c0-b74f-d7283b625da0"). InnerVolumeSpecName "kube-api-access-2fqwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.304756 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f5334e8-ad35-41c0-b74f-d7283b625da0" (UID: "9f5334e8-ad35-41c0-b74f-d7283b625da0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.308183 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f5334e8-ad35-41c0-b74f-d7283b625da0" (UID: "9f5334e8-ad35-41c0-b74f-d7283b625da0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.309044 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f5334e8-ad35-41c0-b74f-d7283b625da0" (UID: "9f5334e8-ad35-41c0-b74f-d7283b625da0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.316465 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config" (OuterVolumeSpecName: "config") pod "9f5334e8-ad35-41c0-b74f-d7283b625da0" (UID: "9f5334e8-ad35-41c0-b74f-d7283b625da0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.361075 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.361118 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.361133 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.361147 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fqwn\" (UniqueName: \"kubernetes.io/projected/9f5334e8-ad35-41c0-b74f-d7283b625da0-kube-api-access-2fqwn\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:27 crc kubenswrapper[4772]: I0127 16:36:27.361156 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f5334e8-ad35-41c0-b74f-d7283b625da0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.000442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" event={"ID":"9f5334e8-ad35-41c0-b74f-d7283b625da0","Type":"ContainerDied","Data":"f24d4586c71be1b7413cfd65197ec5183ca5b0b700f7ea419468e58b997588e5"} Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.000505 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55b86d75d9-f64vw" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.001210 4772 scope.go:117] "RemoveContainer" containerID="e2902da210e724572224515decc30f5a1e9fa786701c1868e86464783fea99a9" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.052885 4772 scope.go:117] "RemoveContainer" containerID="7afe0140c22bc4a9daf1f5cd3e32f3b90e22ae38f75fd3783a129a88072a7fc4" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.053001 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.060806 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55b86d75d9-f64vw"] Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.091362 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:28 crc kubenswrapper[4772]: E0127 16:36:28.092700 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="init" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.092725 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="init" Jan 27 16:36:28 crc kubenswrapper[4772]: E0127 16:36:28.092755 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="dnsmasq-dns" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.092763 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="dnsmasq-dns" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.092965 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" containerName="dnsmasq-dns" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.097750 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.107201 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.118953 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.178453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.178506 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr8tc\" (UniqueName: \"kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.178531 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.265070 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dcb7f9846-lrk6t" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.282087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.282129 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cr8tc\" (UniqueName: \"kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.282179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.282644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.284859 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.312336 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr8tc\" (UniqueName: 
\"kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc\") pod \"redhat-operators-74dnd\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.430118 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.681049 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5334e8-ad35-41c0-b74f-d7283b625da0" path="/var/lib/kubelet/pods/9f5334e8-ad35-41c0-b74f-d7283b625da0/volumes" Jan 27 16:36:28 crc kubenswrapper[4772]: I0127 16:36:28.685018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:29 crc kubenswrapper[4772]: I0127 16:36:29.022822 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerID="4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333" exitCode=0 Jan 27 16:36:29 crc kubenswrapper[4772]: I0127 16:36:29.022923 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerDied","Data":"4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333"} Jan 27 16:36:29 crc kubenswrapper[4772]: I0127 16:36:29.023266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerStarted","Data":"edeeb734fb1a3ced8ebfb84b4f3f42b1d212bd81615ff209859424b50ce18695"} Jan 27 16:36:30 crc kubenswrapper[4772]: I0127 16:36:30.032805 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" 
event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerStarted","Data":"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0"} Jan 27 16:36:31 crc kubenswrapper[4772]: I0127 16:36:31.047473 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerID="b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0" exitCode=0 Jan 27 16:36:31 crc kubenswrapper[4772]: I0127 16:36:31.047530 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerDied","Data":"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0"} Jan 27 16:36:32 crc kubenswrapper[4772]: I0127 16:36:32.061216 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerStarted","Data":"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168"} Jan 27 16:36:32 crc kubenswrapper[4772]: I0127 16:36:32.083653 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-74dnd" podStartSLOduration=1.66403699 podStartE2EDuration="4.083633477s" podCreationTimestamp="2026-01-27 16:36:28 +0000 UTC" firstStartedPulling="2026-01-27 16:36:29.027427004 +0000 UTC m=+5375.008036102" lastFinishedPulling="2026-01-27 16:36:31.447023481 +0000 UTC m=+5377.427632589" observedRunningTime="2026-01-27 16:36:32.082488835 +0000 UTC m=+5378.063097933" watchObservedRunningTime="2026-01-27 16:36:32.083633477 +0000 UTC m=+5378.064242575" Jan 27 16:36:38 crc kubenswrapper[4772]: I0127 16:36:38.430300 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:38 crc kubenswrapper[4772]: I0127 16:36:38.430812 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:38 crc kubenswrapper[4772]: I0127 16:36:38.485180 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:39 crc kubenswrapper[4772]: I0127 16:36:39.156885 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:39 crc kubenswrapper[4772]: I0127 16:36:39.209413 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:39 crc kubenswrapper[4772]: I0127 16:36:39.908115 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-49fxh"] Jan 27 16:36:39 crc kubenswrapper[4772]: I0127 16:36:39.910110 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:39 crc kubenswrapper[4772]: I0127 16:36:39.917743 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-49fxh"] Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.013747 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fbd6-account-create-update-kvjsf"] Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.014953 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.024746 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.055444 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbd6-account-create-update-kvjsf"] Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.081419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf2sr\" (UniqueName: \"kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr\") pod \"neutron-db-create-49fxh\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.081510 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts\") pod \"neutron-db-create-49fxh\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.183463 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.183724 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf2sr\" (UniqueName: \"kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr\") pod \"neutron-db-create-49fxh\" (UID: 
\"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.183822 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts\") pod \"neutron-db-create-49fxh\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.184011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkkbh\" (UniqueName: \"kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.184680 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts\") pod \"neutron-db-create-49fxh\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.207807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf2sr\" (UniqueName: \"kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr\") pod \"neutron-db-create-49fxh\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.227096 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.285624 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.285787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkkbh\" (UniqueName: \"kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.286649 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.309600 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkkbh\" (UniqueName: \"kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh\") pod \"neutron-fbd6-account-create-update-kvjsf\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.346383 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.677123 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-49fxh"] Jan 27 16:36:40 crc kubenswrapper[4772]: I0127 16:36:40.800019 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fbd6-account-create-update-kvjsf"] Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.134519 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbd6-account-create-update-kvjsf" event={"ID":"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b","Type":"ContainerStarted","Data":"7820202ed4fe89d2fb42f752ab53ea23a8554f7e3772282ffa645c3878d8acde"} Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.134570 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbd6-account-create-update-kvjsf" event={"ID":"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b","Type":"ContainerStarted","Data":"98049675b35bf9aa5f55e27edcefac02abeb383c73491edfaf8dc4f2d0cc9061"} Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.135873 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-49fxh" event={"ID":"6dd740e1-b6ee-444d-b71e-e18d4837ef8a","Type":"ContainerStarted","Data":"0eb151bf9a1bfafe986b27f8d85f2a3fcbd2a1f6be731bcd2ce95359e6a6e136"} Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.135902 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-49fxh" event={"ID":"6dd740e1-b6ee-444d-b71e-e18d4837ef8a","Type":"ContainerStarted","Data":"63032a2c7b9511cf6dd4c145bad9bed20775a86fc21f664cdf358270e2e03ec9"} Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.136216 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-74dnd" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="registry-server" 
containerID="cri-o://ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168" gracePeriod=2 Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.147347 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fbd6-account-create-update-kvjsf" podStartSLOduration=2.147329898 podStartE2EDuration="2.147329898s" podCreationTimestamp="2026-01-27 16:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:41.147213485 +0000 UTC m=+5387.127822593" watchObservedRunningTime="2026-01-27 16:36:41.147329898 +0000 UTC m=+5387.127938996" Jan 27 16:36:41 crc kubenswrapper[4772]: I0127 16:36:41.169069 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-49fxh" podStartSLOduration=2.169050107 podStartE2EDuration="2.169050107s" podCreationTimestamp="2026-01-27 16:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:41.163271212 +0000 UTC m=+5387.143880300" watchObservedRunningTime="2026-01-27 16:36:41.169050107 +0000 UTC m=+5387.149659205" Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.842074 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.926843 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities\") pod \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.927199 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content\") pod \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.927309 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr8tc\" (UniqueName: \"kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc\") pod \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\" (UID: \"a3141ae0-d9c6-4eb9-8ada-8dd2454da297\") " Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.927891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities" (OuterVolumeSpecName: "utilities") pod "a3141ae0-d9c6-4eb9-8ada-8dd2454da297" (UID: "a3141ae0-d9c6-4eb9-8ada-8dd2454da297"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:36:42 crc kubenswrapper[4772]: I0127 16:36:42.934629 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc" (OuterVolumeSpecName: "kube-api-access-cr8tc") pod "a3141ae0-d9c6-4eb9-8ada-8dd2454da297" (UID: "a3141ae0-d9c6-4eb9-8ada-8dd2454da297"). InnerVolumeSpecName "kube-api-access-cr8tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.029782 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.029825 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cr8tc\" (UniqueName: \"kubernetes.io/projected/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-kube-api-access-cr8tc\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.041529 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3141ae0-d9c6-4eb9-8ada-8dd2454da297" (UID: "a3141ae0-d9c6-4eb9-8ada-8dd2454da297"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.131716 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3141ae0-d9c6-4eb9-8ada-8dd2454da297-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.156070 4772 generic.go:334] "Generic (PLEG): container finished" podID="0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" containerID="7820202ed4fe89d2fb42f752ab53ea23a8554f7e3772282ffa645c3878d8acde" exitCode=0 Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.156143 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbd6-account-create-update-kvjsf" event={"ID":"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b","Type":"ContainerDied","Data":"7820202ed4fe89d2fb42f752ab53ea23a8554f7e3772282ffa645c3878d8acde"} Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.159306 4772 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-74dnd" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.159348 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerID="ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168" exitCode=0 Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.159425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerDied","Data":"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168"} Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.159570 4772 scope.go:117] "RemoveContainer" containerID="ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.159454 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-74dnd" event={"ID":"a3141ae0-d9c6-4eb9-8ada-8dd2454da297","Type":"ContainerDied","Data":"edeeb734fb1a3ced8ebfb84b4f3f42b1d212bd81615ff209859424b50ce18695"} Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.161299 4772 generic.go:334] "Generic (PLEG): container finished" podID="6dd740e1-b6ee-444d-b71e-e18d4837ef8a" containerID="0eb151bf9a1bfafe986b27f8d85f2a3fcbd2a1f6be731bcd2ce95359e6a6e136" exitCode=0 Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.161332 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-49fxh" event={"ID":"6dd740e1-b6ee-444d-b71e-e18d4837ef8a","Type":"ContainerDied","Data":"0eb151bf9a1bfafe986b27f8d85f2a3fcbd2a1f6be731bcd2ce95359e6a6e136"} Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.185598 4772 scope.go:117] "RemoveContainer" containerID="b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.203118 4772 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.213696 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-74dnd"] Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.221612 4772 scope.go:117] "RemoveContainer" containerID="4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.246805 4772 scope.go:117] "RemoveContainer" containerID="ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168" Jan 27 16:36:43 crc kubenswrapper[4772]: E0127 16:36:43.247280 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168\": container with ID starting with ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168 not found: ID does not exist" containerID="ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.247343 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168"} err="failed to get container status \"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168\": rpc error: code = NotFound desc = could not find container \"ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168\": container with ID starting with ce0e7196a0a64497f42a021e97d6f099be19a6b7a8db931509a25e0302440168 not found: ID does not exist" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.247372 4772 scope.go:117] "RemoveContainer" containerID="b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0" Jan 27 16:36:43 crc kubenswrapper[4772]: E0127 16:36:43.247692 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0\": container with ID starting with b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0 not found: ID does not exist" containerID="b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.247747 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0"} err="failed to get container status \"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0\": rpc error: code = NotFound desc = could not find container \"b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0\": container with ID starting with b27ff19e88ae10072e54a035b5eeff3472d319b04facf31b91170a2a03f287d0 not found: ID does not exist" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.247781 4772 scope.go:117] "RemoveContainer" containerID="4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333" Jan 27 16:36:43 crc kubenswrapper[4772]: E0127 16:36:43.248089 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333\": container with ID starting with 4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333 not found: ID does not exist" containerID="4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333" Jan 27 16:36:43 crc kubenswrapper[4772]: I0127 16:36:43.248132 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333"} err="failed to get container status \"4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333\": rpc error: code = NotFound desc = could not find container 
\"4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333\": container with ID starting with 4823b71d7a398dbe52d74d93991da82e8680383fcd12420daf8b869332d53333 not found: ID does not exist" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.050194 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bxrqf"] Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.057415 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bxrqf"] Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.522801 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.528199 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.660076 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts\") pod \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\" (UID: \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.660223 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkkbh\" (UniqueName: \"kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh\") pod \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.660322 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf2sr\" (UniqueName: \"kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr\") pod \"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\" (UID: 
\"6dd740e1-b6ee-444d-b71e-e18d4837ef8a\") " Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.660399 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts\") pod \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\" (UID: \"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b\") " Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.660997 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6dd740e1-b6ee-444d-b71e-e18d4837ef8a" (UID: "6dd740e1-b6ee-444d-b71e-e18d4837ef8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.661054 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" (UID: "0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.670448 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh" (OuterVolumeSpecName: "kube-api-access-wkkbh") pod "0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" (UID: "0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b"). InnerVolumeSpecName "kube-api-access-wkkbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.670498 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr" (OuterVolumeSpecName: "kube-api-access-hf2sr") pod "6dd740e1-b6ee-444d-b71e-e18d4837ef8a" (UID: "6dd740e1-b6ee-444d-b71e-e18d4837ef8a"). InnerVolumeSpecName "kube-api-access-hf2sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.674622 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" path="/var/lib/kubelet/pods/a3141ae0-d9c6-4eb9-8ada-8dd2454da297/volumes" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.675499 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb558533-27c2-4249-9beb-e01d5b918c58" path="/var/lib/kubelet/pods/cb558533-27c2-4249-9beb-e01d5b918c58/volumes" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.762568 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.762598 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.762609 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkkbh\" (UniqueName: \"kubernetes.io/projected/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b-kube-api-access-wkkbh\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:44 crc kubenswrapper[4772]: I0127 16:36:44.762618 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf2sr\" 
(UniqueName: \"kubernetes.io/projected/6dd740e1-b6ee-444d-b71e-e18d4837ef8a-kube-api-access-hf2sr\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.179404 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fbd6-account-create-update-kvjsf" Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.179402 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fbd6-account-create-update-kvjsf" event={"ID":"0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b","Type":"ContainerDied","Data":"98049675b35bf9aa5f55e27edcefac02abeb383c73491edfaf8dc4f2d0cc9061"} Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.179520 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98049675b35bf9aa5f55e27edcefac02abeb383c73491edfaf8dc4f2d0cc9061" Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.181245 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-49fxh" event={"ID":"6dd740e1-b6ee-444d-b71e-e18d4837ef8a","Type":"ContainerDied","Data":"63032a2c7b9511cf6dd4c145bad9bed20775a86fc21f664cdf358270e2e03ec9"} Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.181275 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63032a2c7b9511cf6dd4c145bad9bed20775a86fc21f664cdf358270e2e03ec9" Jan 27 16:36:45 crc kubenswrapper[4772]: I0127 16:36:45.181307 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-49fxh" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.254801 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-xvkmn"] Jan 27 16:36:50 crc kubenswrapper[4772]: E0127 16:36:50.256708 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="extract-utilities" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.256801 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="extract-utilities" Jan 27 16:36:50 crc kubenswrapper[4772]: E0127 16:36:50.256876 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="extract-content" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.256928 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="extract-content" Jan 27 16:36:50 crc kubenswrapper[4772]: E0127 16:36:50.256978 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="registry-server" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257027 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="registry-server" Jan 27 16:36:50 crc kubenswrapper[4772]: E0127 16:36:50.257084 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" containerName="mariadb-account-create-update" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257132 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" containerName="mariadb-account-create-update" Jan 27 16:36:50 crc kubenswrapper[4772]: E0127 16:36:50.257225 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6dd740e1-b6ee-444d-b71e-e18d4837ef8a" containerName="mariadb-database-create" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257295 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd740e1-b6ee-444d-b71e-e18d4837ef8a" containerName="mariadb-database-create" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257491 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" containerName="mariadb-account-create-update" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257571 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd740e1-b6ee-444d-b71e-e18d4837ef8a" containerName="mariadb-database-create" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.257636 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3141ae0-d9c6-4eb9-8ada-8dd2454da297" containerName="registry-server" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.258277 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.260714 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.260843 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4g84q" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.261747 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.268771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xvkmn"] Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.353146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.353793 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqvpw\" (UniqueName: \"kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.354082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.455563 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.455617 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqvpw\" (UniqueName: \"kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.455750 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.464565 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.466922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.472585 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqvpw\" (UniqueName: 
\"kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw\") pod \"neutron-db-sync-xvkmn\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:50 crc kubenswrapper[4772]: I0127 16:36:50.583680 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:51 crc kubenswrapper[4772]: I0127 16:36:51.014804 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xvkmn"] Jan 27 16:36:51 crc kubenswrapper[4772]: I0127 16:36:51.224982 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xvkmn" event={"ID":"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2","Type":"ContainerStarted","Data":"47404bfd4ce996befd817152654b651725e0f51b6b3b55ee5ee110296d25e0c8"} Jan 27 16:36:51 crc kubenswrapper[4772]: I0127 16:36:51.225037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xvkmn" event={"ID":"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2","Type":"ContainerStarted","Data":"53062de60772780f98beb1805b741d9fbfee4d6d4c0d97ad27dba9fd1047fbfc"} Jan 27 16:36:51 crc kubenswrapper[4772]: I0127 16:36:51.249494 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-xvkmn" podStartSLOduration=1.249475248 podStartE2EDuration="1.249475248s" podCreationTimestamp="2026-01-27 16:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:36:51.24007387 +0000 UTC m=+5397.220682978" watchObservedRunningTime="2026-01-27 16:36:51.249475248 +0000 UTC m=+5397.230084346" Jan 27 16:36:56 crc kubenswrapper[4772]: I0127 16:36:56.276540 4772 generic.go:334] "Generic (PLEG): container finished" podID="fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" containerID="47404bfd4ce996befd817152654b651725e0f51b6b3b55ee5ee110296d25e0c8" exitCode=0 Jan 27 16:36:56 crc 
kubenswrapper[4772]: I0127 16:36:56.276646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xvkmn" event={"ID":"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2","Type":"ContainerDied","Data":"47404bfd4ce996befd817152654b651725e0f51b6b3b55ee5ee110296d25e0c8"} Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.620356 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.706904 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config\") pod \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.707270 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqvpw\" (UniqueName: \"kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw\") pod \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.707320 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle\") pod \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\" (UID: \"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2\") " Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.712367 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw" (OuterVolumeSpecName: "kube-api-access-tqvpw") pod "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" (UID: "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2"). InnerVolumeSpecName "kube-api-access-tqvpw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.728468 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" (UID: "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.733679 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config" (OuterVolumeSpecName: "config") pod "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" (UID: "fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.809470 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqvpw\" (UniqueName: \"kubernetes.io/projected/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-kube-api-access-tqvpw\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.809507 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:57 crc kubenswrapper[4772]: I0127 16:36:57.809519 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.292617 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xvkmn" 
event={"ID":"fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2","Type":"ContainerDied","Data":"53062de60772780f98beb1805b741d9fbfee4d6d4c0d97ad27dba9fd1047fbfc"} Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.292973 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53062de60772780f98beb1805b741d9fbfee4d6d4c0d97ad27dba9fd1047fbfc" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.292687 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xvkmn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.533093 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:36:58 crc kubenswrapper[4772]: E0127 16:36:58.533533 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" containerName="neutron-db-sync" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.533557 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" containerName="neutron-db-sync" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.533805 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" containerName="neutron-db-sync" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.534937 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.547152 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.626262 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.626399 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.626438 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl6sk\" (UniqueName: \"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.626455 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.626473 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.637378 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8bf5d4b7c-bfg78"] Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.639376 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.645001 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.646453 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4g84q" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.646575 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.676978 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bf5d4b7c-bfg78"] Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.727901 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rj5m\" (UniqueName: \"kubernetes.io/projected/b5a89957-107d-449b-b438-2215fd4ed522-kube-api-access-8rj5m\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.727990 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-combined-ca-bundle\") pod 
\"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728111 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-httpd-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728284 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728342 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl6sk\" (UniqueName: \"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728368 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: 
\"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728391 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.728420 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.729322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.729389 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.729425 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc 
kubenswrapper[4772]: I0127 16:36:58.729476 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.746936 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl6sk\" (UniqueName: \"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk\") pod \"dnsmasq-dns-85ccbf7777-mh9xn\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.831447 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rj5m\" (UniqueName: \"kubernetes.io/projected/b5a89957-107d-449b-b438-2215fd4ed522-kube-api-access-8rj5m\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.831704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-combined-ca-bundle\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.831816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.831870 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-httpd-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.835884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.835921 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-combined-ca-bundle\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.840184 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b5a89957-107d-449b-b438-2215fd4ed522-httpd-config\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.850872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rj5m\" (UniqueName: \"kubernetes.io/projected/b5a89957-107d-449b-b438-2215fd4ed522-kube-api-access-8rj5m\") pod \"neutron-8bf5d4b7c-bfg78\" (UID: \"b5a89957-107d-449b-b438-2215fd4ed522\") " pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.864834 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:36:58 crc kubenswrapper[4772]: I0127 16:36:58.958630 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:36:59 crc kubenswrapper[4772]: I0127 16:36:59.374258 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:36:59 crc kubenswrapper[4772]: W0127 16:36:59.381677 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6449727_ae23_4ae0_b6e6_4c1cef43ef53.slice/crio-0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e WatchSource:0}: Error finding container 0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e: Status 404 returned error can't find the container with id 0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e Jan 27 16:36:59 crc kubenswrapper[4772]: I0127 16:36:59.523071 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8bf5d4b7c-bfg78"] Jan 27 16:36:59 crc kubenswrapper[4772]: W0127 16:36:59.525405 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5a89957_107d_449b_b438_2215fd4ed522.slice/crio-b962fcddf62956fb0c0d0b689222c008baa58e3667e11f9643a61f281a7a1dba WatchSource:0}: Error finding container b962fcddf62956fb0c0d0b689222c008baa58e3667e11f9643a61f281a7a1dba: Status 404 returned error can't find the container with id b962fcddf62956fb0c0d0b689222c008baa58e3667e11f9643a61f281a7a1dba Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.306764 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerID="71552776ddecc62c8f2d51f59b6ec4b6ba66501bb9fcfb2256ecea66da9231d2" exitCode=0 Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.306870 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" event={"ID":"b6449727-ae23-4ae0-b6e6-4c1cef43ef53","Type":"ContainerDied","Data":"71552776ddecc62c8f2d51f59b6ec4b6ba66501bb9fcfb2256ecea66da9231d2"} Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.307067 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" event={"ID":"b6449727-ae23-4ae0-b6e6-4c1cef43ef53","Type":"ContainerStarted","Data":"0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e"} Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.309936 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf5d4b7c-bfg78" event={"ID":"b5a89957-107d-449b-b438-2215fd4ed522","Type":"ContainerStarted","Data":"e7a293400012c2920a2a67640d49f3635b735a4b47247efa0afa1b93866b0de5"} Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.309976 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf5d4b7c-bfg78" event={"ID":"b5a89957-107d-449b-b438-2215fd4ed522","Type":"ContainerStarted","Data":"414006ec43fc2848a611209d333ad38eb189a587359c2196aa3950825367faba"} Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.309987 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8bf5d4b7c-bfg78" event={"ID":"b5a89957-107d-449b-b438-2215fd4ed522","Type":"ContainerStarted","Data":"b962fcddf62956fb0c0d0b689222c008baa58e3667e11f9643a61f281a7a1dba"} Jan 27 16:37:00 crc kubenswrapper[4772]: I0127 16:37:00.310107 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:37:01 crc kubenswrapper[4772]: I0127 16:37:01.323098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" event={"ID":"b6449727-ae23-4ae0-b6e6-4c1cef43ef53","Type":"ContainerStarted","Data":"f3387679ac491bfa95cee52f7df561b5ae6817e29d93d8687efd8b7b43af3938"} Jan 27 16:37:01 crc kubenswrapper[4772]: 
I0127 16:37:01.323569 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:37:01 crc kubenswrapper[4772]: I0127 16:37:01.352741 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8bf5d4b7c-bfg78" podStartSLOduration=3.352718699 podStartE2EDuration="3.352718699s" podCreationTimestamp="2026-01-27 16:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:00.360755504 +0000 UTC m=+5406.341364602" watchObservedRunningTime="2026-01-27 16:37:01.352718699 +0000 UTC m=+5407.333327797" Jan 27 16:37:06 crc kubenswrapper[4772]: I0127 16:37:06.838784 4772 scope.go:117] "RemoveContainer" containerID="65777ccce3cb931b879ebb264390f1a957ffbebf2e9446690c38c79d1e3ddb7c" Jan 27 16:37:08 crc kubenswrapper[4772]: I0127 16:37:08.867140 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:37:08 crc kubenswrapper[4772]: I0127 16:37:08.898598 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" podStartSLOduration=10.898579426 podStartE2EDuration="10.898579426s" podCreationTimestamp="2026-01-27 16:36:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:01.346933974 +0000 UTC m=+5407.327543072" watchObservedRunningTime="2026-01-27 16:37:08.898579426 +0000 UTC m=+5414.879188524" Jan 27 16:37:08 crc kubenswrapper[4772]: I0127 16:37:08.944731 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:37:08 crc kubenswrapper[4772]: I0127 16:37:08.945079 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" 
podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="dnsmasq-dns" containerID="cri-o://6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70" gracePeriod=10 Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.384746 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.388270 4772 generic.go:334] "Generic (PLEG): container finished" podID="53c70df7-cdca-4296-af71-b5b002484575" containerID="6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70" exitCode=0 Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.388312 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.388315 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" event={"ID":"53c70df7-cdca-4296-af71-b5b002484575","Type":"ContainerDied","Data":"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70"} Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.388485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7c89dc7-g6rsn" event={"ID":"53c70df7-cdca-4296-af71-b5b002484575","Type":"ContainerDied","Data":"ab1dfeb7c7875a02f8dfd72be8da51340b47ef61cc5646a975b8f955d01b7ae0"} Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.388530 4772 scope.go:117] "RemoveContainer" containerID="6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.409971 4772 scope.go:117] "RemoveContainer" containerID="d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.461229 4772 scope.go:117] "RemoveContainer" containerID="6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70" Jan 27 16:37:09 crc 
kubenswrapper[4772]: E0127 16:37:09.465860 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70\": container with ID starting with 6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70 not found: ID does not exist" containerID="6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.465912 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70"} err="failed to get container status \"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70\": rpc error: code = NotFound desc = could not find container \"6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70\": container with ID starting with 6271d29d3114164d534a9aed9b1a74dcb46dca81dd4e99dc8ee2c848b0770a70 not found: ID does not exist" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.465940 4772 scope.go:117] "RemoveContainer" containerID="d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba" Jan 27 16:37:09 crc kubenswrapper[4772]: E0127 16:37:09.467411 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba\": container with ID starting with d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba not found: ID does not exist" containerID="d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.467467 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba"} err="failed to get container status 
\"d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba\": rpc error: code = NotFound desc = could not find container \"d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba\": container with ID starting with d7ad019f98bbdfacb03703ed4de276fec18a82aa27109f1c0ec2b92483868eba not found: ID does not exist" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.529894 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb\") pod \"53c70df7-cdca-4296-af71-b5b002484575\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.530048 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb\") pod \"53c70df7-cdca-4296-af71-b5b002484575\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.530096 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwtxg\" (UniqueName: \"kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg\") pod \"53c70df7-cdca-4296-af71-b5b002484575\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.530149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config\") pod \"53c70df7-cdca-4296-af71-b5b002484575\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.530200 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc\") pod 
\"53c70df7-cdca-4296-af71-b5b002484575\" (UID: \"53c70df7-cdca-4296-af71-b5b002484575\") " Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.539127 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg" (OuterVolumeSpecName: "kube-api-access-qwtxg") pod "53c70df7-cdca-4296-af71-b5b002484575" (UID: "53c70df7-cdca-4296-af71-b5b002484575"). InnerVolumeSpecName "kube-api-access-qwtxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.570895 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53c70df7-cdca-4296-af71-b5b002484575" (UID: "53c70df7-cdca-4296-af71-b5b002484575"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.579904 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53c70df7-cdca-4296-af71-b5b002484575" (UID: "53c70df7-cdca-4296-af71-b5b002484575"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.580673 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config" (OuterVolumeSpecName: "config") pod "53c70df7-cdca-4296-af71-b5b002484575" (UID: "53c70df7-cdca-4296-af71-b5b002484575"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.583060 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53c70df7-cdca-4296-af71-b5b002484575" (UID: "53c70df7-cdca-4296-af71-b5b002484575"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.632345 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.632377 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.632388 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwtxg\" (UniqueName: \"kubernetes.io/projected/53c70df7-cdca-4296-af71-b5b002484575-kube-api-access-qwtxg\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.632399 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.632409 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53c70df7-cdca-4296-af71-b5b002484575-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:09 crc kubenswrapper[4772]: I0127 16:37:09.719374 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:37:09 crc 
kubenswrapper[4772]: I0127 16:37:09.725592 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d7c89dc7-g6rsn"] Jan 27 16:37:10 crc kubenswrapper[4772]: I0127 16:37:10.680730 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c70df7-cdca-4296-af71-b5b002484575" path="/var/lib/kubelet/pods/53c70df7-cdca-4296-af71-b5b002484575/volumes" Jan 27 16:37:12 crc kubenswrapper[4772]: I0127 16:37:12.058504 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:37:12 crc kubenswrapper[4772]: I0127 16:37:12.058932 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:37:28 crc kubenswrapper[4772]: I0127 16:37:28.968037 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8bf5d4b7c-bfg78" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.082464 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ng4h8"] Jan 27 16:37:36 crc kubenswrapper[4772]: E0127 16:37:36.083472 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="dnsmasq-dns" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.083490 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="dnsmasq-dns" Jan 27 16:37:36 crc kubenswrapper[4772]: E0127 16:37:36.083502 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="init" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.083509 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="init" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.083709 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c70df7-cdca-4296-af71-b5b002484575" containerName="dnsmasq-dns" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.084467 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.092563 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ng4h8"] Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.169020 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-ab59-account-create-update-p9dsf"] Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.170475 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.174488 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.186085 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ab59-account-create-update-p9dsf"] Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.197017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzj8\" (UniqueName: \"kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.197473 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.298939 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.299041 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " 
pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.299078 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzj8\" (UniqueName: \"kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.299245 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5s7w\" (UniqueName: \"kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.299679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.318303 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzj8\" (UniqueName: \"kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8\") pod \"glance-db-create-ng4h8\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.401119 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: 
\"615446cc-6fba-46f2-aad9-434f11519be9\") " pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.401226 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5s7w\" (UniqueName: \"kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.402331 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.412819 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.419584 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5s7w\" (UniqueName: \"kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w\") pod \"glance-ab59-account-create-update-p9dsf\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.485135 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:36 crc kubenswrapper[4772]: I0127 16:37:36.897781 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ng4h8"] Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.018092 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-ab59-account-create-update-p9dsf"] Jan 27 16:37:37 crc kubenswrapper[4772]: W0127 16:37:37.032976 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615446cc_6fba_46f2_aad9_434f11519be9.slice/crio-f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63 WatchSource:0}: Error finding container f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63: Status 404 returned error can't find the container with id f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63 Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.625935 4772 generic.go:334] "Generic (PLEG): container finished" podID="615446cc-6fba-46f2-aad9-434f11519be9" containerID="04b6871605c5a7ec7b4197615c41edaa0e9453b396fcff037ce590df9243c6a0" exitCode=0 Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.626318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ab59-account-create-update-p9dsf" event={"ID":"615446cc-6fba-46f2-aad9-434f11519be9","Type":"ContainerDied","Data":"04b6871605c5a7ec7b4197615c41edaa0e9453b396fcff037ce590df9243c6a0"} Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.626839 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ab59-account-create-update-p9dsf" event={"ID":"615446cc-6fba-46f2-aad9-434f11519be9","Type":"ContainerStarted","Data":"f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63"} Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.629355 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="2b150c73-f6eb-4193-81ef-84941ff1abef" containerID="39b253bea060fd0ef46113186002866f36924e03fe711afec1f871af51f40edf" exitCode=0 Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.629397 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ng4h8" event={"ID":"2b150c73-f6eb-4193-81ef-84941ff1abef","Type":"ContainerDied","Data":"39b253bea060fd0ef46113186002866f36924e03fe711afec1f871af51f40edf"} Jan 27 16:37:37 crc kubenswrapper[4772]: I0127 16:37:37.629421 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ng4h8" event={"ID":"2b150c73-f6eb-4193-81ef-84941ff1abef","Type":"ContainerStarted","Data":"c88e5ee63c5b143c29f95eb7a9163107e38a5ec7e21af15fb8f049bff95deca3"} Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.006091 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.013100 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.040237 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzj8\" (UniqueName: \"kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8\") pod \"2b150c73-f6eb-4193-81ef-84941ff1abef\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.040343 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts\") pod \"615446cc-6fba-46f2-aad9-434f11519be9\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.040461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts\") pod \"2b150c73-f6eb-4193-81ef-84941ff1abef\" (UID: \"2b150c73-f6eb-4193-81ef-84941ff1abef\") " Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.040621 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5s7w\" (UniqueName: \"kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w\") pod \"615446cc-6fba-46f2-aad9-434f11519be9\" (UID: \"615446cc-6fba-46f2-aad9-434f11519be9\") " Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.043104 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "615446cc-6fba-46f2-aad9-434f11519be9" (UID: "615446cc-6fba-46f2-aad9-434f11519be9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.043968 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b150c73-f6eb-4193-81ef-84941ff1abef" (UID: "2b150c73-f6eb-4193-81ef-84941ff1abef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.055939 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8" (OuterVolumeSpecName: "kube-api-access-6mzj8") pod "2b150c73-f6eb-4193-81ef-84941ff1abef" (UID: "2b150c73-f6eb-4193-81ef-84941ff1abef"). InnerVolumeSpecName "kube-api-access-6mzj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.058868 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w" (OuterVolumeSpecName: "kube-api-access-l5s7w") pod "615446cc-6fba-46f2-aad9-434f11519be9" (UID: "615446cc-6fba-46f2-aad9-434f11519be9"). InnerVolumeSpecName "kube-api-access-l5s7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.142935 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b150c73-f6eb-4193-81ef-84941ff1abef-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.142975 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5s7w\" (UniqueName: \"kubernetes.io/projected/615446cc-6fba-46f2-aad9-434f11519be9-kube-api-access-l5s7w\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.142992 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzj8\" (UniqueName: \"kubernetes.io/projected/2b150c73-f6eb-4193-81ef-84941ff1abef-kube-api-access-6mzj8\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.143000 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/615446cc-6fba-46f2-aad9-434f11519be9-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.645072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-ab59-account-create-update-p9dsf" event={"ID":"615446cc-6fba-46f2-aad9-434f11519be9","Type":"ContainerDied","Data":"f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63"} Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.645119 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f776daf862a05563cb80d9a9a1825c7258a4908f0d54212641976908fafb9c63" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.645095 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-ab59-account-create-update-p9dsf" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.646266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ng4h8" event={"ID":"2b150c73-f6eb-4193-81ef-84941ff1abef","Type":"ContainerDied","Data":"c88e5ee63c5b143c29f95eb7a9163107e38a5ec7e21af15fb8f049bff95deca3"} Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.646329 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ng4h8" Jan 27 16:37:39 crc kubenswrapper[4772]: I0127 16:37:39.646330 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c88e5ee63c5b143c29f95eb7a9163107e38a5ec7e21af15fb8f049bff95deca3" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.429718 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wlbbm"] Jan 27 16:37:41 crc kubenswrapper[4772]: E0127 16:37:41.430363 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b150c73-f6eb-4193-81ef-84941ff1abef" containerName="mariadb-database-create" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.430376 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b150c73-f6eb-4193-81ef-84941ff1abef" containerName="mariadb-database-create" Jan 27 16:37:41 crc kubenswrapper[4772]: E0127 16:37:41.430396 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615446cc-6fba-46f2-aad9-434f11519be9" containerName="mariadb-account-create-update" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.430402 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="615446cc-6fba-46f2-aad9-434f11519be9" containerName="mariadb-account-create-update" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.430574 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b150c73-f6eb-4193-81ef-84941ff1abef" containerName="mariadb-database-create" Jan 27 
16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.430590 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="615446cc-6fba-46f2-aad9-434f11519be9" containerName="mariadb-account-create-update" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.431317 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.433466 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.434177 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7r5xp" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.438193 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wlbbm"] Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.483789 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.483852 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.483892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4b7k\" (UniqueName: \"kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k\") pod \"glance-db-sync-wlbbm\" (UID: 
\"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.483971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.585621 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.585743 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.585781 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.585810 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4b7k\" (UniqueName: \"kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 
16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.592094 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.592913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.596095 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.607017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4b7k\" (UniqueName: \"kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k\") pod \"glance-db-sync-wlbbm\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:41 crc kubenswrapper[4772]: I0127 16:37:41.757356 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:42 crc kubenswrapper[4772]: I0127 16:37:42.058304 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:37:42 crc kubenswrapper[4772]: I0127 16:37:42.058641 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:37:42 crc kubenswrapper[4772]: I0127 16:37:42.428755 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wlbbm"] Jan 27 16:37:42 crc kubenswrapper[4772]: I0127 16:37:42.673780 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlbbm" event={"ID":"ac290494-b5ad-4d85-9f14-daf092e3a6ed","Type":"ContainerStarted","Data":"e22dc408be2c6bac485917f551c4608cbccf8ac7f37c8b56903873b8fa5f6e6e"} Jan 27 16:37:43 crc kubenswrapper[4772]: I0127 16:37:43.677737 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlbbm" event={"ID":"ac290494-b5ad-4d85-9f14-daf092e3a6ed","Type":"ContainerStarted","Data":"972404acaa773e237766879c66e6f19a8b3951bb52780f66333fe8405eb0ccb2"} Jan 27 16:37:46 crc kubenswrapper[4772]: I0127 16:37:46.716573 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac290494-b5ad-4d85-9f14-daf092e3a6ed" containerID="972404acaa773e237766879c66e6f19a8b3951bb52780f66333fe8405eb0ccb2" exitCode=0 Jan 27 16:37:46 crc kubenswrapper[4772]: I0127 16:37:46.716660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlbbm" 
event={"ID":"ac290494-b5ad-4d85-9f14-daf092e3a6ed","Type":"ContainerDied","Data":"972404acaa773e237766879c66e6f19a8b3951bb52780f66333fe8405eb0ccb2"} Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.097885 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.289570 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data\") pod \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.289945 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4b7k\" (UniqueName: \"kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k\") pod \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.289989 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle\") pod \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.290092 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data\") pod \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\" (UID: \"ac290494-b5ad-4d85-9f14-daf092e3a6ed\") " Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.295591 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data" 
(OuterVolumeSpecName: "db-sync-config-data") pod "ac290494-b5ad-4d85-9f14-daf092e3a6ed" (UID: "ac290494-b5ad-4d85-9f14-daf092e3a6ed"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.301870 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k" (OuterVolumeSpecName: "kube-api-access-w4b7k") pod "ac290494-b5ad-4d85-9f14-daf092e3a6ed" (UID: "ac290494-b5ad-4d85-9f14-daf092e3a6ed"). InnerVolumeSpecName "kube-api-access-w4b7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.312877 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac290494-b5ad-4d85-9f14-daf092e3a6ed" (UID: "ac290494-b5ad-4d85-9f14-daf092e3a6ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.335203 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data" (OuterVolumeSpecName: "config-data") pod "ac290494-b5ad-4d85-9f14-daf092e3a6ed" (UID: "ac290494-b5ad-4d85-9f14-daf092e3a6ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.392271 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.392307 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4b7k\" (UniqueName: \"kubernetes.io/projected/ac290494-b5ad-4d85-9f14-daf092e3a6ed-kube-api-access-w4b7k\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.392321 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.392335 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac290494-b5ad-4d85-9f14-daf092e3a6ed-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.749734 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wlbbm" event={"ID":"ac290494-b5ad-4d85-9f14-daf092e3a6ed","Type":"ContainerDied","Data":"e22dc408be2c6bac485917f551c4608cbccf8ac7f37c8b56903873b8fa5f6e6e"} Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.749782 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e22dc408be2c6bac485917f551c4608cbccf8ac7f37c8b56903873b8fa5f6e6e" Jan 27 16:37:48 crc kubenswrapper[4772]: I0127 16:37:48.749831 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wlbbm" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.025755 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:49 crc kubenswrapper[4772]: E0127 16:37:49.026080 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac290494-b5ad-4d85-9f14-daf092e3a6ed" containerName="glance-db-sync" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.026093 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac290494-b5ad-4d85-9f14-daf092e3a6ed" containerName="glance-db-sync" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.026289 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac290494-b5ad-4d85-9f14-daf092e3a6ed" containerName="glance-db-sync" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.028380 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.030601 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.030912 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7r5xp" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.031083 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.033976 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.045577 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.142097 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.152747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.161065 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206749 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206797 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206848 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206867 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 
16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206885 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwn2d\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.206984 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.280454 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.282143 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.284289 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.298146 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.308855 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.308910 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.308949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.308976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 
crc kubenswrapper[4772]: I0127 16:37:49.309020 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7grf\" (UniqueName: \"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309044 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309153 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc 
kubenswrapper[4772]: I0127 16:37:49.309264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309293 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwn2d\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.309993 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.310294 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.314230 4772 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.318916 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.320273 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.325818 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.339233 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwn2d\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d\") pod \"glance-default-external-api-0\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.352627 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.410798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411075 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411144 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411186 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 
16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411234 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7grf\" (UniqueName: \"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411260 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411333 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411439 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpmx\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411476 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.411675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.412283 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.412655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc 
kubenswrapper[4772]: I0127 16:37:49.412914 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.440671 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7grf\" (UniqueName: \"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf\") pod \"dnsmasq-dns-6d645dd9d5-2pwb9\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.472593 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513490 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513539 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wpmx\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx\") pod 
\"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513621 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513683 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513722 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.513766 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.514474 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " 
pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.514716 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.519667 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.522150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.535819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.538784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.538926 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpmx\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx\") pod \"glance-default-internal-api-0\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.599382 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:49 crc kubenswrapper[4772]: I0127 16:37:49.969471 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.183077 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.423660 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.750006 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.850499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerStarted","Data":"e9209abb013b605df859331d891ccbf91ecc4ce083f4e3c8a98e078222c67e1b"} Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.851874 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerStarted","Data":"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56"} Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.851901 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerStarted","Data":"516d152c267168193958fb213e9f1682cf2f22937a080abe30eadfb6c7582832"} Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.853564 4772 generic.go:334] "Generic (PLEG): container finished" podID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerID="cfd61d60b4611b222a3dcb5c92c6682ed0c719a75f85f6d5730b053534f9b1b0" exitCode=0 Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.853605 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" event={"ID":"e289d3f6-26ba-4306-a7f0-bf95513c9068","Type":"ContainerDied","Data":"cfd61d60b4611b222a3dcb5c92c6682ed0c719a75f85f6d5730b053534f9b1b0"} Jan 27 16:37:50 crc kubenswrapper[4772]: I0127 16:37:50.853620 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" event={"ID":"e289d3f6-26ba-4306-a7f0-bf95513c9068","Type":"ContainerStarted","Data":"5dc3b25da49a88c2e2d810e005946006b5e97f2c6fb476e48c50681bedaf4daf"} Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.863692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerStarted","Data":"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0"} Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.864325 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerStarted","Data":"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889"} Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.865634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerStarted","Data":"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c"} Jan 27 16:37:51 
crc kubenswrapper[4772]: I0127 16:37:51.865916 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-log" containerID="cri-o://a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" gracePeriod=30 Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.865956 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-httpd" containerID="cri-o://12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" gracePeriod=30 Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.867790 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" event={"ID":"e289d3f6-26ba-4306-a7f0-bf95513c9068","Type":"ContainerStarted","Data":"de4a1855c8f97732a1ffd2229841c1e180e5f299f24cbbca78a154e8db2ccc0d"} Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.868245 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.890134 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.890113881 podStartE2EDuration="2.890113881s" podCreationTimestamp="2026-01-27 16:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:51.884571633 +0000 UTC m=+5457.865180741" watchObservedRunningTime="2026-01-27 16:37:51.890113881 +0000 UTC m=+5457.870722969" Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.915451 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" podStartSLOduration=2.915427912 
podStartE2EDuration="2.915427912s" podCreationTimestamp="2026-01-27 16:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:51.908126344 +0000 UTC m=+5457.888735462" watchObservedRunningTime="2026-01-27 16:37:51.915427912 +0000 UTC m=+5457.896037010" Jan 27 16:37:51 crc kubenswrapper[4772]: I0127 16:37:51.930055 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.930021258 podStartE2EDuration="3.930021258s" podCreationTimestamp="2026-01-27 16:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:51.929944726 +0000 UTC m=+5457.910553854" watchObservedRunningTime="2026-01-27 16:37:51.930021258 +0000 UTC m=+5457.910630356" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.484909 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.647803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.647857 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.647949 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.647990 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.648014 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.648058 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwn2d\" (UniqueName: 
\"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.648110 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle\") pod \"7cf8dce7-53a4-4a54-baad-3787346773ae\" (UID: \"7cf8dce7-53a4-4a54-baad-3787346773ae\") " Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.648575 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.648681 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs" (OuterVolumeSpecName: "logs") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.649282 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.649309 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cf8dce7-53a4-4a54-baad-3787346773ae-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.655036 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph" (OuterVolumeSpecName: "ceph") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.655178 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts" (OuterVolumeSpecName: "scripts") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.656373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d" (OuterVolumeSpecName: "kube-api-access-jwn2d") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "kube-api-access-jwn2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.689304 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.705065 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data" (OuterVolumeSpecName: "config-data") pod "7cf8dce7-53a4-4a54-baad-3787346773ae" (UID: "7cf8dce7-53a4-4a54-baad-3787346773ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.750093 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.750137 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-ceph\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.750153 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwn2d\" (UniqueName: \"kubernetes.io/projected/7cf8dce7-53a4-4a54-baad-3787346773ae-kube-api-access-jwn2d\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.750185 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 
16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.750199 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cf8dce7-53a4-4a54-baad-3787346773ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.755977 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888037 4772 generic.go:334] "Generic (PLEG): container finished" podID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerID="12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" exitCode=0 Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888070 4772 generic.go:334] "Generic (PLEG): container finished" podID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerID="a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" exitCode=143 Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888090 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888144 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerDied","Data":"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c"} Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888212 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerDied","Data":"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56"} Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888227 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7cf8dce7-53a4-4a54-baad-3787346773ae","Type":"ContainerDied","Data":"516d152c267168193958fb213e9f1682cf2f22937a080abe30eadfb6c7582832"} Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.888246 4772 scope.go:117] "RemoveContainer" containerID="12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.923010 4772 scope.go:117] "RemoveContainer" containerID="a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.925138 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.942370 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.953012 4772 scope.go:117] "RemoveContainer" containerID="12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" Jan 27 16:37:52 crc kubenswrapper[4772]: E0127 16:37:52.953976 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c\": container with ID starting with 12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c not found: ID does not exist" containerID="12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.954003 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c"} err="failed to get container status \"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c\": rpc error: code = NotFound desc = could not find container \"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c\": container with ID starting with 12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c not found: ID does not exist" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.954022 4772 scope.go:117] "RemoveContainer" containerID="a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.955092 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:52 crc kubenswrapper[4772]: E0127 16:37:52.955483 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-httpd" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.955496 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-httpd" Jan 27 16:37:52 crc kubenswrapper[4772]: E0127 16:37:52.955517 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-log" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.955524 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-log" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.955678 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-httpd" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.955698 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" containerName="glance-log" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.958065 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:52 crc kubenswrapper[4772]: E0127 16:37:52.962902 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56\": container with ID starting with a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56 not found: ID does not exist" containerID="a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.983782 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56"} err="failed to get container status \"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56\": rpc error: code = NotFound desc = could not find container \"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56\": container with ID starting with a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56 not found: ID does not exist" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.992039 4772 scope.go:117] "RemoveContainer" containerID="12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.994236 4772 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c"} err="failed to get container status \"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c\": rpc error: code = NotFound desc = could not find container \"12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c\": container with ID starting with 12839ace5fe2bccfe75af24d40677d5809cc38fc3bfddb8ee3117ee32088743c not found: ID does not exist" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.994284 4772 scope.go:117] "RemoveContainer" containerID="a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.995894 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.999015 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56"} err="failed to get container status \"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56\": rpc error: code = NotFound desc = could not find container \"a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56\": container with ID starting with a2465bbbe9a52b9b42ecc99a8f99f0479ff0f8665e48c3d8d198379279a15b56 not found: ID does not exist" Jan 27 16:37:52 crc kubenswrapper[4772]: I0127 16:37:52.999853 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.156872 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " 
pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157047 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157094 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157139 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl9ff\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-kube-api-access-bl9ff\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157278 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-logs\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157351 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.157428 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl9ff\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-kube-api-access-bl9ff\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258554 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-logs\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258598 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258703 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258777 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.258806 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.259064 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-logs\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.261205 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7917063-9e04-41e8-8fb9-e8383f839bd6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 
16:37:53.264847 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-ceph\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.264880 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.265534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.269410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7917063-9e04-41e8-8fb9-e8383f839bd6-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.278159 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl9ff\" (UniqueName: \"kubernetes.io/projected/a7917063-9e04-41e8-8fb9-e8383f839bd6-kube-api-access-bl9ff\") pod \"glance-default-external-api-0\" (UID: \"a7917063-9e04-41e8-8fb9-e8383f839bd6\") " pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.297962 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.881523 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 16:37:53 crc kubenswrapper[4772]: W0127 16:37:53.885038 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7917063_9e04_41e8_8fb9_e8383f839bd6.slice/crio-3ebb35abcf6caaaad30b633f23eadd07d995c99ccfbf81457ba471e6fe0cbc23 WatchSource:0}: Error finding container 3ebb35abcf6caaaad30b633f23eadd07d995c99ccfbf81457ba471e6fe0cbc23: Status 404 returned error can't find the container with id 3ebb35abcf6caaaad30b633f23eadd07d995c99ccfbf81457ba471e6fe0cbc23 Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.900280 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7917063-9e04-41e8-8fb9-e8383f839bd6","Type":"ContainerStarted","Data":"3ebb35abcf6caaaad30b633f23eadd07d995c99ccfbf81457ba471e6fe0cbc23"} Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.900658 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-httpd" containerID="cri-o://91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" gracePeriod=30 Jan 27 16:37:53 crc kubenswrapper[4772]: I0127 16:37:53.900455 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-log" containerID="cri-o://5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" gracePeriod=30 Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.552122 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.589815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.589935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.590590 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wpmx\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.590637 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.590666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.590700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.591094 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs\") pod \"04330651-2770-404a-a5ff-66c7ce91b3e7\" (UID: \"04330651-2770-404a-a5ff-66c7ce91b3e7\") " Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.591793 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs" (OuterVolumeSpecName: "logs") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.592040 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.593360 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx" (OuterVolumeSpecName: "kube-api-access-7wpmx") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "kube-api-access-7wpmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.597076 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph" (OuterVolumeSpecName: "ceph") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.609222 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts" (OuterVolumeSpecName: "scripts") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.650605 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data" (OuterVolumeSpecName: "config-data") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.652264 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04330651-2770-404a-a5ff-66c7ce91b3e7" (UID: "04330651-2770-404a-a5ff-66c7ce91b3e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.680510 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf8dce7-53a4-4a54-baad-3787346773ae" path="/var/lib/kubelet/pods/7cf8dce7-53a4-4a54-baad-3787346773ae/volumes" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693090 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693127 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693140 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wpmx\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-kube-api-access-7wpmx\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693155 4772 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04330651-2770-404a-a5ff-66c7ce91b3e7-ceph\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693183 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04330651-2770-404a-a5ff-66c7ce91b3e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693197 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.693207 4772 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/04330651-2770-404a-a5ff-66c7ce91b3e7-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916019 4772 generic.go:334] "Generic (PLEG): container finished" podID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerID="91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" exitCode=0 Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916054 4772 generic.go:334] "Generic (PLEG): container finished" podID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerID="5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" exitCode=143 Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916080 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916103 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerDied","Data":"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0"} Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerDied","Data":"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889"} Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916178 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04330651-2770-404a-a5ff-66c7ce91b3e7","Type":"ContainerDied","Data":"e9209abb013b605df859331d891ccbf91ecc4ce083f4e3c8a98e078222c67e1b"} Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.916206 4772 scope.go:117] "RemoveContainer" containerID="91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 
16:37:54.919115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7917063-9e04-41e8-8fb9-e8383f839bd6","Type":"ContainerStarted","Data":"1367a12b2c0bd35eb3d620c4a253cd5dac72c6b345a3f6be430247d7c052b1c4"} Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.941297 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.942963 4772 scope.go:117] "RemoveContainer" containerID="5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.958801 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.965070 4772 scope.go:117] "RemoveContainer" containerID="91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" Jan 27 16:37:54 crc kubenswrapper[4772]: E0127 16:37:54.965658 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0\": container with ID starting with 91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0 not found: ID does not exist" containerID="91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.965707 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0"} err="failed to get container status \"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0\": rpc error: code = NotFound desc = could not find container \"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0\": container with ID starting with 91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0 not found: ID 
does not exist" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.965740 4772 scope.go:117] "RemoveContainer" containerID="5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" Jan 27 16:37:54 crc kubenswrapper[4772]: E0127 16:37:54.966002 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889\": container with ID starting with 5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889 not found: ID does not exist" containerID="5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.966025 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889"} err="failed to get container status \"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889\": rpc error: code = NotFound desc = could not find container \"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889\": container with ID starting with 5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889 not found: ID does not exist" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.966041 4772 scope.go:117] "RemoveContainer" containerID="91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.966264 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0"} err="failed to get container status \"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0\": rpc error: code = NotFound desc = could not find container \"91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0\": container with ID starting with 91a840ae8df2a102ddfe506d79b6aa47edece57d2e8cac22e3c8b0eb6eb724a0 not 
found: ID does not exist" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.966283 4772 scope.go:117] "RemoveContainer" containerID="5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.966477 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889"} err="failed to get container status \"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889\": rpc error: code = NotFound desc = could not find container \"5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889\": container with ID starting with 5ee5dd0b1e52995f524745c5dd7bbd5d75f473825ad1a4ec87c3a7016be76889 not found: ID does not exist" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.968693 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:54 crc kubenswrapper[4772]: E0127 16:37:54.969008 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-httpd" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.969021 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-httpd" Jan 27 16:37:54 crc kubenswrapper[4772]: E0127 16:37:54.969060 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-log" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.969069 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-log" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.969264 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-httpd" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 
16:37:54.969280 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" containerName="glance-log" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.970186 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.979128 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 16:37:54 crc kubenswrapper[4772]: I0127 16:37:54.991852 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004005 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88rx\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-kube-api-access-d88rx\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004222 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004305 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004341 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.004363 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105566 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105606 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105692 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105715 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88rx\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-kube-api-access-d88rx\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.105985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.106073 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a1535f57-0540-45ea-b53c-1b4cac461cf3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.106738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.106811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.111814 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.111854 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.112128 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1535f57-0540-45ea-b53c-1b4cac461cf3-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.112146 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.122987 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88rx\" (UniqueName: \"kubernetes.io/projected/a1535f57-0540-45ea-b53c-1b4cac461cf3-kube-api-access-d88rx\") pod \"glance-default-internal-api-0\" (UID: \"a1535f57-0540-45ea-b53c-1b4cac461cf3\") " pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.288370 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.769758 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.942023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1535f57-0540-45ea-b53c-1b4cac461cf3","Type":"ContainerStarted","Data":"bdd1e6d352ccc9ac313ef81a1a9d4729caed4c9ef3252c7f8ccf561487a8dddf"} Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.951611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7917063-9e04-41e8-8fb9-e8383f839bd6","Type":"ContainerStarted","Data":"4ef258baec7de4dd434ebb599831e99ae07b1cc3d174d18617f6f85f4baa1b33"} Jan 27 16:37:55 crc kubenswrapper[4772]: I0127 16:37:55.977217 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=3.977195878 podStartE2EDuration="3.977195878s" podCreationTimestamp="2026-01-27 16:37:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:55.969308824 +0000 UTC m=+5461.949917932" watchObservedRunningTime="2026-01-27 16:37:55.977195878 +0000 UTC m=+5461.957804976" Jan 27 16:37:56 crc kubenswrapper[4772]: I0127 16:37:56.674705 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04330651-2770-404a-a5ff-66c7ce91b3e7" path="/var/lib/kubelet/pods/04330651-2770-404a-a5ff-66c7ce91b3e7/volumes" Jan 27 16:37:56 crc kubenswrapper[4772]: I0127 16:37:56.962994 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1535f57-0540-45ea-b53c-1b4cac461cf3","Type":"ContainerStarted","Data":"59949bbfc428bded0d6c0d47e4e1ba36374017de77611ef43497f757582df6cb"} Jan 27 16:37:56 crc kubenswrapper[4772]: I0127 16:37:56.963047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a1535f57-0540-45ea-b53c-1b4cac461cf3","Type":"ContainerStarted","Data":"beef63a5f0beb83017ebac7eba7f0abb95542a31b3a6d3d3ab0cffdfa906e99a"} Jan 27 16:37:56 crc kubenswrapper[4772]: I0127 16:37:56.989427 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.989406201 podStartE2EDuration="2.989406201s" podCreationTimestamp="2026-01-27 16:37:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:37:56.988242927 +0000 UTC m=+5462.968852045" watchObservedRunningTime="2026-01-27 16:37:56.989406201 +0000 UTC m=+5462.970015289" Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.474337 4772 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.550578 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.550861 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="dnsmasq-dns" containerID="cri-o://f3387679ac491bfa95cee52f7df561b5ae6817e29d93d8687efd8b7b43af3938" gracePeriod=10 Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.996157 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerID="f3387679ac491bfa95cee52f7df561b5ae6817e29d93d8687efd8b7b43af3938" exitCode=0 Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.996200 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" event={"ID":"b6449727-ae23-4ae0-b6e6-4c1cef43ef53","Type":"ContainerDied","Data":"f3387679ac491bfa95cee52f7df561b5ae6817e29d93d8687efd8b7b43af3938"} Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.996572 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" event={"ID":"b6449727-ae23-4ae0-b6e6-4c1cef43ef53","Type":"ContainerDied","Data":"0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e"} Jan 27 16:37:59 crc kubenswrapper[4772]: I0127 16:37:59.996606 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0286cc31499d6fb5d457987c1e11aad0e7f2bc6e4ccd62d6d0d7e8570d1f3e2e" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.047552 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.103441 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb\") pod \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.103539 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc\") pod \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.103754 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb\") pod \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.103816 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl6sk\" (UniqueName: \"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk\") pod \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.103838 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config\") pod \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\" (UID: \"b6449727-ae23-4ae0-b6e6-4c1cef43ef53\") " Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.117965 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk" (OuterVolumeSpecName: "kube-api-access-sl6sk") pod "b6449727-ae23-4ae0-b6e6-4c1cef43ef53" (UID: "b6449727-ae23-4ae0-b6e6-4c1cef43ef53"). InnerVolumeSpecName "kube-api-access-sl6sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.171115 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6449727-ae23-4ae0-b6e6-4c1cef43ef53" (UID: "b6449727-ae23-4ae0-b6e6-4c1cef43ef53"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.171708 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config" (OuterVolumeSpecName: "config") pod "b6449727-ae23-4ae0-b6e6-4c1cef43ef53" (UID: "b6449727-ae23-4ae0-b6e6-4c1cef43ef53"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.174811 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6449727-ae23-4ae0-b6e6-4c1cef43ef53" (UID: "b6449727-ae23-4ae0-b6e6-4c1cef43ef53"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.203947 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6449727-ae23-4ae0-b6e6-4c1cef43ef53" (UID: "b6449727-ae23-4ae0-b6e6-4c1cef43ef53"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.206829 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.206889 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl6sk\" (UniqueName: \"kubernetes.io/projected/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-kube-api-access-sl6sk\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.206907 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.206919 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:00 crc kubenswrapper[4772]: I0127 16:38:00.206949 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6449727-ae23-4ae0-b6e6-4c1cef43ef53-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:01 crc kubenswrapper[4772]: I0127 16:38:01.005728 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ccbf7777-mh9xn" Jan 27 16:38:01 crc kubenswrapper[4772]: I0127 16:38:01.027371 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:38:01 crc kubenswrapper[4772]: I0127 16:38:01.034147 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ccbf7777-mh9xn"] Jan 27 16:38:02 crc kubenswrapper[4772]: I0127 16:38:02.672481 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" path="/var/lib/kubelet/pods/b6449727-ae23-4ae0-b6e6-4c1cef43ef53/volumes" Jan 27 16:38:03 crc kubenswrapper[4772]: I0127 16:38:03.298642 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 16:38:03 crc kubenswrapper[4772]: I0127 16:38:03.298690 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 16:38:03 crc kubenswrapper[4772]: I0127 16:38:03.343916 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 16:38:03 crc kubenswrapper[4772]: I0127 16:38:03.343975 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 16:38:04 crc kubenswrapper[4772]: I0127 16:38:04.029404 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 16:38:04 crc kubenswrapper[4772]: I0127 16:38:04.029693 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 16:38:05 crc kubenswrapper[4772]: I0127 16:38:05.289038 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:05 crc kubenswrapper[4772]: I0127 16:38:05.290042 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:05 crc kubenswrapper[4772]: I0127 16:38:05.314223 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:05 crc kubenswrapper[4772]: I0127 16:38:05.337065 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:06 crc kubenswrapper[4772]: I0127 16:38:06.043525 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:06 crc kubenswrapper[4772]: I0127 16:38:06.043565 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:06 crc kubenswrapper[4772]: I0127 16:38:06.226315 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 16:38:06 crc kubenswrapper[4772]: I0127 16:38:06.226453 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 16:38:06 crc kubenswrapper[4772]: I0127 16:38:06.228447 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 16:38:08 crc kubenswrapper[4772]: I0127 16:38:08.118920 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:08 crc kubenswrapper[4772]: I0127 16:38:08.119345 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 16:38:08 crc kubenswrapper[4772]: I0127 16:38:08.372579 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 16:38:12 crc kubenswrapper[4772]: I0127 16:38:12.058350 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:38:12 crc kubenswrapper[4772]: I0127 16:38:12.058902 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:38:12 crc kubenswrapper[4772]: I0127 16:38:12.058953 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:38:12 crc kubenswrapper[4772]: I0127 16:38:12.059695 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:38:12 crc kubenswrapper[4772]: I0127 16:38:12.059750 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9" gracePeriod=600 Jan 27 16:38:13 crc kubenswrapper[4772]: I0127 16:38:13.102726 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9" exitCode=0 Jan 27 16:38:13 crc kubenswrapper[4772]: I0127 16:38:13.102902 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9"} Jan 27 16:38:13 crc kubenswrapper[4772]: I0127 16:38:13.103330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69"} Jan 27 16:38:13 crc kubenswrapper[4772]: I0127 16:38:13.103363 4772 scope.go:117] "RemoveContainer" containerID="beb82f81f96be589cf221c90702e405768d59833a36f70e2929085c7b622f86b" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.272305 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wfhq4"] Jan 27 16:38:18 crc kubenswrapper[4772]: E0127 16:38:18.273256 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="dnsmasq-dns" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.273272 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="dnsmasq-dns" Jan 27 16:38:18 crc kubenswrapper[4772]: E0127 16:38:18.273285 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="init" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.273294 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="init" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.273458 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6449727-ae23-4ae0-b6e6-4c1cef43ef53" containerName="dnsmasq-dns" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.274084 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.281257 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wfhq4"] Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.377920 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c674-account-create-update-zr9fr"] Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.378961 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.381154 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.390235 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c674-account-create-update-zr9fr"] Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.413075 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzw5\" (UniqueName: \"kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.413232 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.515254 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bcr\" (UniqueName: 
\"kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr\") pod \"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.515587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.515646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts\") pod \"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.515809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzw5\" (UniqueName: \"kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.516513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.541238 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lmzw5\" (UniqueName: \"kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5\") pod \"placement-db-create-wfhq4\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.594644 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.616949 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bcr\" (UniqueName: \"kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr\") pod \"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.617034 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts\") pod \"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.617735 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts\") pod \"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.636478 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bcr\" (UniqueName: \"kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr\") pod 
\"placement-c674-account-create-update-zr9fr\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:18 crc kubenswrapper[4772]: I0127 16:38:18.693582 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.030762 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wfhq4"] Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.145259 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c674-account-create-update-zr9fr"] Jan 27 16:38:19 crc kubenswrapper[4772]: W0127 16:38:19.148965 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68c87e84_0237_41a3_b248_59f0e0156b81.slice/crio-d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819 WatchSource:0}: Error finding container d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819: Status 404 returned error can't find the container with id d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819 Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.175830 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wfhq4" event={"ID":"5a6c1c65-36ca-4017-a8e2-5e22a550d601","Type":"ContainerStarted","Data":"159f9ff911847523ab0387be1212efb17cae848a6dcdc1e80961565a39d1eac9"} Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.175886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wfhq4" event={"ID":"5a6c1c65-36ca-4017-a8e2-5e22a550d601","Type":"ContainerStarted","Data":"206780997b1e028010072b79de7396fbb1a2e61dfc93554e5d59fbd376679321"} Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.177922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-c674-account-create-update-zr9fr" event={"ID":"68c87e84-0237-41a3-b248-59f0e0156b81","Type":"ContainerStarted","Data":"d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819"} Jan 27 16:38:19 crc kubenswrapper[4772]: I0127 16:38:19.196771 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-wfhq4" podStartSLOduration=1.196750614 podStartE2EDuration="1.196750614s" podCreationTimestamp="2026-01-27 16:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:38:19.189506128 +0000 UTC m=+5485.170115246" watchObservedRunningTime="2026-01-27 16:38:19.196750614 +0000 UTC m=+5485.177359722" Jan 27 16:38:20 crc kubenswrapper[4772]: I0127 16:38:20.186472 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a6c1c65-36ca-4017-a8e2-5e22a550d601" containerID="159f9ff911847523ab0387be1212efb17cae848a6dcdc1e80961565a39d1eac9" exitCode=0 Jan 27 16:38:20 crc kubenswrapper[4772]: I0127 16:38:20.186657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wfhq4" event={"ID":"5a6c1c65-36ca-4017-a8e2-5e22a550d601","Type":"ContainerDied","Data":"159f9ff911847523ab0387be1212efb17cae848a6dcdc1e80961565a39d1eac9"} Jan 27 16:38:20 crc kubenswrapper[4772]: I0127 16:38:20.188735 4772 generic.go:334] "Generic (PLEG): container finished" podID="68c87e84-0237-41a3-b248-59f0e0156b81" containerID="fffc3d88e7cbd76e4f8c55e9d4f80e7dfb6325460b6429ed67f89d060f3c380c" exitCode=0 Jan 27 16:38:20 crc kubenswrapper[4772]: I0127 16:38:20.188790 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c674-account-create-update-zr9fr" event={"ID":"68c87e84-0237-41a3-b248-59f0e0156b81","Type":"ContainerDied","Data":"fffc3d88e7cbd76e4f8c55e9d4f80e7dfb6325460b6429ed67f89d060f3c380c"} Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.561717 4772 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.566459 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.669917 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9bcr\" (UniqueName: \"kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr\") pod \"68c87e84-0237-41a3-b248-59f0e0156b81\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.669980 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts\") pod \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.670089 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmzw5\" (UniqueName: \"kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5\") pod \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\" (UID: \"5a6c1c65-36ca-4017-a8e2-5e22a550d601\") " Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.670192 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts\") pod \"68c87e84-0237-41a3-b248-59f0e0156b81\" (UID: \"68c87e84-0237-41a3-b248-59f0e0156b81\") " Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.671388 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "68c87e84-0237-41a3-b248-59f0e0156b81" (UID: "68c87e84-0237-41a3-b248-59f0e0156b81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.671442 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5a6c1c65-36ca-4017-a8e2-5e22a550d601" (UID: "5a6c1c65-36ca-4017-a8e2-5e22a550d601"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.676688 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr" (OuterVolumeSpecName: "kube-api-access-h9bcr") pod "68c87e84-0237-41a3-b248-59f0e0156b81" (UID: "68c87e84-0237-41a3-b248-59f0e0156b81"). InnerVolumeSpecName "kube-api-access-h9bcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.677091 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5" (OuterVolumeSpecName: "kube-api-access-lmzw5") pod "5a6c1c65-36ca-4017-a8e2-5e22a550d601" (UID: "5a6c1c65-36ca-4017-a8e2-5e22a550d601"). InnerVolumeSpecName "kube-api-access-lmzw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.772402 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5a6c1c65-36ca-4017-a8e2-5e22a550d601-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.772435 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmzw5\" (UniqueName: \"kubernetes.io/projected/5a6c1c65-36ca-4017-a8e2-5e22a550d601-kube-api-access-lmzw5\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.772444 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68c87e84-0237-41a3-b248-59f0e0156b81-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:21 crc kubenswrapper[4772]: I0127 16:38:21.772453 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9bcr\" (UniqueName: \"kubernetes.io/projected/68c87e84-0237-41a3-b248-59f0e0156b81-kube-api-access-h9bcr\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.206965 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c674-account-create-update-zr9fr" event={"ID":"68c87e84-0237-41a3-b248-59f0e0156b81","Type":"ContainerDied","Data":"d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819"} Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.207320 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d877cfb8558a44192f82348a36e723a383dfeb73c183d53028045e743c3eb819" Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.206979 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c674-account-create-update-zr9fr" Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.208796 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wfhq4" event={"ID":"5a6c1c65-36ca-4017-a8e2-5e22a550d601","Type":"ContainerDied","Data":"206780997b1e028010072b79de7396fbb1a2e61dfc93554e5d59fbd376679321"} Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.208823 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wfhq4" Jan 27 16:38:22 crc kubenswrapper[4772]: I0127 16:38:22.208840 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="206780997b1e028010072b79de7396fbb1a2e61dfc93554e5d59fbd376679321" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.691993 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:38:23 crc kubenswrapper[4772]: E0127 16:38:23.692637 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c87e84-0237-41a3-b248-59f0e0156b81" containerName="mariadb-account-create-update" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.692651 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c87e84-0237-41a3-b248-59f0e0156b81" containerName="mariadb-account-create-update" Jan 27 16:38:23 crc kubenswrapper[4772]: E0127 16:38:23.692669 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6c1c65-36ca-4017-a8e2-5e22a550d601" containerName="mariadb-database-create" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.692675 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6c1c65-36ca-4017-a8e2-5e22a550d601" containerName="mariadb-database-create" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.692849 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6c1c65-36ca-4017-a8e2-5e22a550d601" 
containerName="mariadb-database-create" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.692866 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c87e84-0237-41a3-b248-59f0e0156b81" containerName="mariadb-account-create-update" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.693785 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.721688 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.771526 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4pvhb"] Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.772761 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.778619 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.778685 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-h5gqt" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.778700 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.791959 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4pvhb"] Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.815369 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " 
pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.815573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.815700 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8vx\" (UniqueName: \"kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.815759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.816027 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.917801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " 
pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.917858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.917915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.917934 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918234 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 
27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918491 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918595 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8vx\" (UniqueName: \"kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918755 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj2tt\" (UniqueName: \"kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918898 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.918967 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.919476 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.919817 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:23 crc kubenswrapper[4772]: I0127 16:38:23.953821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8vx\" (UniqueName: \"kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx\") pod \"dnsmasq-dns-545c956d45-h25qs\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.019033 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.020021 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj2tt\" (UniqueName: \"kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.020107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.020154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.020232 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.020282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 
16:38:24.020756 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.025090 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.034914 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.042597 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.048724 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj2tt\" (UniqueName: \"kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt\") pod \"placement-db-sync-4pvhb\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.096236 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 
16:38:24.097119 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.098284 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.117424 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.225504 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.225576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.225614 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfw6t\" (UniqueName: \"kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.330301 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities\") pod 
\"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.330738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.330796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfw6t\" (UniqueName: \"kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.331008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.331327 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content\") pod \"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.352402 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfw6t\" (UniqueName: \"kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t\") pod 
\"community-operators-p5mjl\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.509590 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.642815 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4pvhb"] Jan 27 16:38:24 crc kubenswrapper[4772]: I0127 16:38:24.797527 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.052613 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:25 crc kubenswrapper[4772]: W0127 16:38:25.072819 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7637c27_38bb_4544_a948_040122a7d526.slice/crio-bde9c232190c4ff499588da71f6a6c3fb3565fcb2c79805b8745d1730e782d30 WatchSource:0}: Error finding container bde9c232190c4ff499588da71f6a6c3fb3565fcb2c79805b8745d1730e782d30: Status 404 returned error can't find the container with id bde9c232190c4ff499588da71f6a6c3fb3565fcb2c79805b8745d1730e782d30 Jan 27 16:38:25 crc kubenswrapper[4772]: E0127 16:38:25.079823 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6da65d05_29e3_4d97_869f_d3386a45a38e.slice/crio-b8551d53299337e281772b00e27dddd2da7a39a65b37574b860847ac8a5c90ba.scope\": RecentStats: unable to find data in memory cache]" Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.241570 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4pvhb" 
event={"ID":"b972e003-d915-4c6e-b84e-00d1f53740c1","Type":"ContainerStarted","Data":"897a6f6215480fb2b302f194b142ae62f3461ba459140cef6dbb5530febc39e7"} Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.241635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4pvhb" event={"ID":"b972e003-d915-4c6e-b84e-00d1f53740c1","Type":"ContainerStarted","Data":"392d0c24ac730d47db73a8d7dc9563348e7fe293fddf40e791b09a461f9aa862"} Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.251764 4772 generic.go:334] "Generic (PLEG): container finished" podID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerID="b8551d53299337e281772b00e27dddd2da7a39a65b37574b860847ac8a5c90ba" exitCode=0 Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.251820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545c956d45-h25qs" event={"ID":"6da65d05-29e3-4d97-869f-d3386a45a38e","Type":"ContainerDied","Data":"b8551d53299337e281772b00e27dddd2da7a39a65b37574b860847ac8a5c90ba"} Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.251878 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545c956d45-h25qs" event={"ID":"6da65d05-29e3-4d97-869f-d3386a45a38e","Type":"ContainerStarted","Data":"1b2cc021d3453413ef029602e151c491355c94bca54c6c90c56cd7685dc93518"} Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.256475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerStarted","Data":"bde9c232190c4ff499588da71f6a6c3fb3565fcb2c79805b8745d1730e782d30"} Jan 27 16:38:25 crc kubenswrapper[4772]: I0127 16:38:25.261025 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4pvhb" podStartSLOduration=2.26100833 podStartE2EDuration="2.26100833s" podCreationTimestamp="2026-01-27 16:38:23 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:38:25.25540107 +0000 UTC m=+5491.236010208" watchObservedRunningTime="2026-01-27 16:38:25.26100833 +0000 UTC m=+5491.241617418" Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.270953 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7637c27-38bb-4544-a948-040122a7d526" containerID="5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687" exitCode=0 Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.271019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerDied","Data":"5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687"} Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.275746 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.279034 4772 generic.go:334] "Generic (PLEG): container finished" podID="b972e003-d915-4c6e-b84e-00d1f53740c1" containerID="897a6f6215480fb2b302f194b142ae62f3461ba459140cef6dbb5530febc39e7" exitCode=0 Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.279194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4pvhb" event={"ID":"b972e003-d915-4c6e-b84e-00d1f53740c1","Type":"ContainerDied","Data":"897a6f6215480fb2b302f194b142ae62f3461ba459140cef6dbb5530febc39e7"} Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.281231 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545c956d45-h25qs" event={"ID":"6da65d05-29e3-4d97-869f-d3386a45a38e","Type":"ContainerStarted","Data":"54f0c375865810e4cb819d2019e6c8ab7827175c7d633ed00deb75c9600fb4b9"} Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.281419 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:26 crc kubenswrapper[4772]: I0127 16:38:26.326406 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-545c956d45-h25qs" podStartSLOduration=3.326387646 podStartE2EDuration="3.326387646s" podCreationTimestamp="2026-01-27 16:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:38:26.322799564 +0000 UTC m=+5492.303408682" watchObservedRunningTime="2026-01-27 16:38:26.326387646 +0000 UTC m=+5492.306996744" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.673082 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835041 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data\") pod \"b972e003-d915-4c6e-b84e-00d1f53740c1\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs\") pod \"b972e003-d915-4c6e-b84e-00d1f53740c1\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835225 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj2tt\" (UniqueName: \"kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt\") pod \"b972e003-d915-4c6e-b84e-00d1f53740c1\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle\") pod \"b972e003-d915-4c6e-b84e-00d1f53740c1\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835458 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts\") pod \"b972e003-d915-4c6e-b84e-00d1f53740c1\" (UID: \"b972e003-d915-4c6e-b84e-00d1f53740c1\") " Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835510 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs" (OuterVolumeSpecName: "logs") pod "b972e003-d915-4c6e-b84e-00d1f53740c1" (UID: "b972e003-d915-4c6e-b84e-00d1f53740c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.835962 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b972e003-d915-4c6e-b84e-00d1f53740c1-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.846290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt" (OuterVolumeSpecName: "kube-api-access-dj2tt") pod "b972e003-d915-4c6e-b84e-00d1f53740c1" (UID: "b972e003-d915-4c6e-b84e-00d1f53740c1"). InnerVolumeSpecName "kube-api-access-dj2tt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.846363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts" (OuterVolumeSpecName: "scripts") pod "b972e003-d915-4c6e-b84e-00d1f53740c1" (UID: "b972e003-d915-4c6e-b84e-00d1f53740c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.862510 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data" (OuterVolumeSpecName: "config-data") pod "b972e003-d915-4c6e-b84e-00d1f53740c1" (UID: "b972e003-d915-4c6e-b84e-00d1f53740c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.865365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b972e003-d915-4c6e-b84e-00d1f53740c1" (UID: "b972e003-d915-4c6e-b84e-00d1f53740c1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.912895 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6ffd9fc5c6-99g52"] Jan 27 16:38:27 crc kubenswrapper[4772]: E0127 16:38:27.913323 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b972e003-d915-4c6e-b84e-00d1f53740c1" containerName="placement-db-sync" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.913340 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b972e003-d915-4c6e-b84e-00d1f53740c1" containerName="placement-db-sync" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.913707 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b972e003-d915-4c6e-b84e-00d1f53740c1" containerName="placement-db-sync" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.914684 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.929056 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ffd9fc5c6-99g52"] Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.940490 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.940534 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:27 crc kubenswrapper[4772]: I0127 16:38:27.940547 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj2tt\" (UniqueName: \"kubernetes.io/projected/b972e003-d915-4c6e-b84e-00d1f53740c1-kube-api-access-dj2tt\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:27 crc kubenswrapper[4772]: 
I0127 16:38:27.940559 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b972e003-d915-4c6e-b84e-00d1f53740c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.041986 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-config-data\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.042044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8daf690-d375-4d0a-b763-4b610aaeac45-logs\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.042138 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kx9v\" (UniqueName: \"kubernetes.io/projected/c8daf690-d375-4d0a-b763-4b610aaeac45-kube-api-access-7kx9v\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.042496 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-scripts\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.042564 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-combined-ca-bundle\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.144763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-scripts\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.144876 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-combined-ca-bundle\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.144930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-config-data\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.144961 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8daf690-d375-4d0a-b763-4b610aaeac45-logs\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.144990 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kx9v\" (UniqueName: 
\"kubernetes.io/projected/c8daf690-d375-4d0a-b763-4b610aaeac45-kube-api-access-7kx9v\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.145557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8daf690-d375-4d0a-b763-4b610aaeac45-logs\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.148445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-scripts\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.148636 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-config-data\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.148909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8daf690-d375-4d0a-b763-4b610aaeac45-combined-ca-bundle\") pod \"placement-6ffd9fc5c6-99g52\" (UID: \"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.161739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kx9v\" (UniqueName: \"kubernetes.io/projected/c8daf690-d375-4d0a-b763-4b610aaeac45-kube-api-access-7kx9v\") pod \"placement-6ffd9fc5c6-99g52\" (UID: 
\"c8daf690-d375-4d0a-b763-4b610aaeac45\") " pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.250389 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.298044 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7637c27-38bb-4544-a948-040122a7d526" containerID="73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48" exitCode=0 Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.298115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerDied","Data":"73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48"} Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.302427 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4pvhb" event={"ID":"b972e003-d915-4c6e-b84e-00d1f53740c1","Type":"ContainerDied","Data":"392d0c24ac730d47db73a8d7dc9563348e7fe293fddf40e791b09a461f9aa862"} Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.302464 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="392d0c24ac730d47db73a8d7dc9563348e7fe293fddf40e791b09a461f9aa862" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.302516 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4pvhb" Jan 27 16:38:28 crc kubenswrapper[4772]: I0127 16:38:28.715459 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6ffd9fc5c6-99g52"] Jan 27 16:38:28 crc kubenswrapper[4772]: W0127 16:38:28.723904 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8daf690_d375_4d0a_b763_4b610aaeac45.slice/crio-cede86ef2b62c284020dea92755cd8f1c45ccf853891c9353a1aaaba262f274e WatchSource:0}: Error finding container cede86ef2b62c284020dea92755cd8f1c45ccf853891c9353a1aaaba262f274e: Status 404 returned error can't find the container with id cede86ef2b62c284020dea92755cd8f1c45ccf853891c9353a1aaaba262f274e Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.314628 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffd9fc5c6-99g52" event={"ID":"c8daf690-d375-4d0a-b763-4b610aaeac45","Type":"ContainerStarted","Data":"a4f7397ce195b3ef684fdb7e4e7bdf987f0aad1b8524c25545d425f4bc248bbd"} Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.314926 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.314940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffd9fc5c6-99g52" event={"ID":"c8daf690-d375-4d0a-b763-4b610aaeac45","Type":"ContainerStarted","Data":"2a9baa311ee399537353526ea6225df9a78dbebd8cc28695cf4db8dba9e9bfb8"} Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.314951 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6ffd9fc5c6-99g52" event={"ID":"c8daf690-d375-4d0a-b763-4b610aaeac45","Type":"ContainerStarted","Data":"cede86ef2b62c284020dea92755cd8f1c45ccf853891c9353a1aaaba262f274e"} Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.318114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerStarted","Data":"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7"} Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.342936 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6ffd9fc5c6-99g52" podStartSLOduration=2.342894988 podStartE2EDuration="2.342894988s" podCreationTimestamp="2026-01-27 16:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:38:29.334486028 +0000 UTC m=+5495.315095146" watchObservedRunningTime="2026-01-27 16:38:29.342894988 +0000 UTC m=+5495.323504086" Jan 27 16:38:29 crc kubenswrapper[4772]: I0127 16:38:29.361923 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p5mjl" podStartSLOduration=2.916434584 podStartE2EDuration="5.36190654s" podCreationTimestamp="2026-01-27 16:38:24 +0000 UTC" firstStartedPulling="2026-01-27 16:38:26.275488445 +0000 UTC m=+5492.256097543" lastFinishedPulling="2026-01-27 16:38:28.720960391 +0000 UTC m=+5494.701569499" observedRunningTime="2026-01-27 16:38:29.359366647 +0000 UTC m=+5495.339975765" watchObservedRunningTime="2026-01-27 16:38:29.36190654 +0000 UTC m=+5495.342515638" Jan 27 16:38:30 crc kubenswrapper[4772]: I0127 16:38:30.335801 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.020377 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.094035 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.094342 4772 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="dnsmasq-dns" containerID="cri-o://de4a1855c8f97732a1ffd2229841c1e180e5f299f24cbbca78a154e8db2ccc0d" gracePeriod=10 Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.389090 4772 generic.go:334] "Generic (PLEG): container finished" podID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerID="de4a1855c8f97732a1ffd2229841c1e180e5f299f24cbbca78a154e8db2ccc0d" exitCode=0 Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.389136 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" event={"ID":"e289d3f6-26ba-4306-a7f0-bf95513c9068","Type":"ContainerDied","Data":"de4a1855c8f97732a1ffd2229841c1e180e5f299f24cbbca78a154e8db2ccc0d"} Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.510652 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.510722 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.554717 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.582024 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.766826 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc\") pod \"e289d3f6-26ba-4306-a7f0-bf95513c9068\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.766935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb\") pod \"e289d3f6-26ba-4306-a7f0-bf95513c9068\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.766985 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb\") pod \"e289d3f6-26ba-4306-a7f0-bf95513c9068\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.767003 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config\") pod \"e289d3f6-26ba-4306-a7f0-bf95513c9068\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.767027 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7grf\" (UniqueName: \"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf\") pod \"e289d3f6-26ba-4306-a7f0-bf95513c9068\" (UID: \"e289d3f6-26ba-4306-a7f0-bf95513c9068\") " Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.774716 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf" (OuterVolumeSpecName: "kube-api-access-q7grf") pod "e289d3f6-26ba-4306-a7f0-bf95513c9068" (UID: "e289d3f6-26ba-4306-a7f0-bf95513c9068"). InnerVolumeSpecName "kube-api-access-q7grf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.812079 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config" (OuterVolumeSpecName: "config") pod "e289d3f6-26ba-4306-a7f0-bf95513c9068" (UID: "e289d3f6-26ba-4306-a7f0-bf95513c9068"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.813371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e289d3f6-26ba-4306-a7f0-bf95513c9068" (UID: "e289d3f6-26ba-4306-a7f0-bf95513c9068"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.816226 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e289d3f6-26ba-4306-a7f0-bf95513c9068" (UID: "e289d3f6-26ba-4306-a7f0-bf95513c9068"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.824481 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e289d3f6-26ba-4306-a7f0-bf95513c9068" (UID: "e289d3f6-26ba-4306-a7f0-bf95513c9068"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.870275 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.870321 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.870336 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.870348 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e289d3f6-26ba-4306-a7f0-bf95513c9068-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:34 crc kubenswrapper[4772]: I0127 16:38:34.870361 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7grf\" (UniqueName: \"kubernetes.io/projected/e289d3f6-26ba-4306-a7f0-bf95513c9068-kube-api-access-q7grf\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.399637 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" event={"ID":"e289d3f6-26ba-4306-a7f0-bf95513c9068","Type":"ContainerDied","Data":"5dc3b25da49a88c2e2d810e005946006b5e97f2c6fb476e48c50681bedaf4daf"} Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.399660 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.399998 4772 scope.go:117] "RemoveContainer" containerID="de4a1855c8f97732a1ffd2229841c1e180e5f299f24cbbca78a154e8db2ccc0d" Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.420355 4772 scope.go:117] "RemoveContainer" containerID="cfd61d60b4611b222a3dcb5c92c6682ed0c719a75f85f6d5730b053534f9b1b0" Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.442502 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.452488 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d645dd9d5-2pwb9"] Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.476572 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:35 crc kubenswrapper[4772]: I0127 16:38:35.538138 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:36 crc kubenswrapper[4772]: I0127 16:38:36.682488 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" path="/var/lib/kubelet/pods/e289d3f6-26ba-4306-a7f0-bf95513c9068/volumes" Jan 27 16:38:37 crc kubenswrapper[4772]: I0127 16:38:37.423930 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p5mjl" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="registry-server" containerID="cri-o://1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7" gracePeriod=2 Jan 27 16:38:37 crc kubenswrapper[4772]: I0127 16:38:37.883260 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.036642 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content\") pod \"d7637c27-38bb-4544-a948-040122a7d526\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.036767 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfw6t\" (UniqueName: \"kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t\") pod \"d7637c27-38bb-4544-a948-040122a7d526\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.036914 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities\") pod \"d7637c27-38bb-4544-a948-040122a7d526\" (UID: \"d7637c27-38bb-4544-a948-040122a7d526\") " Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.037670 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities" (OuterVolumeSpecName: "utilities") pod "d7637c27-38bb-4544-a948-040122a7d526" (UID: "d7637c27-38bb-4544-a948-040122a7d526"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.045416 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t" (OuterVolumeSpecName: "kube-api-access-zfw6t") pod "d7637c27-38bb-4544-a948-040122a7d526" (UID: "d7637c27-38bb-4544-a948-040122a7d526"). InnerVolumeSpecName "kube-api-access-zfw6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.086197 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7637c27-38bb-4544-a948-040122a7d526" (UID: "d7637c27-38bb-4544-a948-040122a7d526"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.138806 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.138849 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7637c27-38bb-4544-a948-040122a7d526-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.138862 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfw6t\" (UniqueName: \"kubernetes.io/projected/d7637c27-38bb-4544-a948-040122a7d526-kube-api-access-zfw6t\") on node \"crc\" DevicePath \"\"" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.451740 4772 generic.go:334] "Generic (PLEG): container finished" podID="d7637c27-38bb-4544-a948-040122a7d526" containerID="1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7" exitCode=0 Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.451788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerDied","Data":"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7"} Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.451821 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-p5mjl" event={"ID":"d7637c27-38bb-4544-a948-040122a7d526","Type":"ContainerDied","Data":"bde9c232190c4ff499588da71f6a6c3fb3565fcb2c79805b8745d1730e782d30"} Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.451838 4772 scope.go:117] "RemoveContainer" containerID="1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.451845 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5mjl" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.475538 4772 scope.go:117] "RemoveContainer" containerID="73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.492922 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.501052 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p5mjl"] Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.506336 4772 scope.go:117] "RemoveContainer" containerID="5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.552857 4772 scope.go:117] "RemoveContainer" containerID="1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7" Jan 27 16:38:38 crc kubenswrapper[4772]: E0127 16:38:38.553432 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7\": container with ID starting with 1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7 not found: ID does not exist" containerID="1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 
16:38:38.553528 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7"} err="failed to get container status \"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7\": rpc error: code = NotFound desc = could not find container \"1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7\": container with ID starting with 1c35a579a5c75625296e16ce590ed79bef653fbfe286f6422ef23d2e956388a7 not found: ID does not exist" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.553556 4772 scope.go:117] "RemoveContainer" containerID="73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48" Jan 27 16:38:38 crc kubenswrapper[4772]: E0127 16:38:38.553965 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48\": container with ID starting with 73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48 not found: ID does not exist" containerID="73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.554001 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48"} err="failed to get container status \"73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48\": rpc error: code = NotFound desc = could not find container \"73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48\": container with ID starting with 73a1bf579e1cad299ee8dcf75b2e0f5d995e618d33eecb56a1a42a2021cd5d48 not found: ID does not exist" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.554022 4772 scope.go:117] "RemoveContainer" containerID="5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687" Jan 27 16:38:38 crc 
kubenswrapper[4772]: E0127 16:38:38.554471 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687\": container with ID starting with 5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687 not found: ID does not exist" containerID="5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.554524 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687"} err="failed to get container status \"5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687\": rpc error: code = NotFound desc = could not find container \"5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687\": container with ID starting with 5f3e19efa8f94154ea817a07d57fc0bef91de6b3d9985a454f4b799359042687 not found: ID does not exist" Jan 27 16:38:38 crc kubenswrapper[4772]: I0127 16:38:38.674509 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7637c27-38bb-4544-a948-040122a7d526" path="/var/lib/kubelet/pods/d7637c27-38bb-4544-a948-040122a7d526/volumes" Jan 27 16:38:39 crc kubenswrapper[4772]: I0127 16:38:39.473877 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6d645dd9d5-2pwb9" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.34:5353: i/o timeout" Jan 27 16:38:59 crc kubenswrapper[4772]: I0127 16:38:59.319329 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:39:00 crc kubenswrapper[4772]: I0127 16:39:00.319398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6ffd9fc5c6-99g52" Jan 27 16:39:06 crc kubenswrapper[4772]: 
I0127 16:39:06.992132 4772 scope.go:117] "RemoveContainer" containerID="f4958ec9744454169fb58baabe20204293a4ff4790174c9c2079b9801fd7028d" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.568455 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-5g8c9"] Jan 27 16:39:20 crc kubenswrapper[4772]: E0127 16:39:20.569324 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="dnsmasq-dns" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569338 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="dnsmasq-dns" Jan 27 16:39:20 crc kubenswrapper[4772]: E0127 16:39:20.569369 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="extract-utilities" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569378 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="extract-utilities" Jan 27 16:39:20 crc kubenswrapper[4772]: E0127 16:39:20.569391 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="init" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569399 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="init" Jan 27 16:39:20 crc kubenswrapper[4772]: E0127 16:39:20.569418 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="extract-content" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569426 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="extract-content" Jan 27 16:39:20 crc kubenswrapper[4772]: E0127 16:39:20.569437 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="registry-server" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569445 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="registry-server" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569636 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7637c27-38bb-4544-a948-040122a7d526" containerName="registry-server" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.569663 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e289d3f6-26ba-4306-a7f0-bf95513c9068" containerName="dnsmasq-dns" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.570400 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.581765 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5g8c9"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.649468 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-wmvjd"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.650541 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.684417 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wmvjd"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.739057 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfb8z\" (UniqueName: \"kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.739141 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgrrr\" (UniqueName: \"kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.739293 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.739475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.758494 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-db-create-fxhzl"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.759683 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.772084 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fxhzl"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.786238 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e098-account-create-update-s8qnw"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.787531 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.789748 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.795056 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e098-account-create-update-s8qnw"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.841064 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.841191 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.841914 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bfb8z\" (UniqueName: \"kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.841990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.842005 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgrrr\" (UniqueName: \"kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.842457 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.860616 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfb8z\" (UniqueName: \"kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z\") pod \"nova-api-db-create-5g8c9\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.864282 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgrrr\" (UniqueName: 
\"kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr\") pod \"nova-cell0-db-create-wmvjd\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.893055 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.943594 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.943699 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxl7\" (UniqueName: \"kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7\") pod \"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.943727 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kb9n\" (UniqueName: \"kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.943813 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts\") pod 
\"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.962718 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a20c-account-create-update-7pf8l"] Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.964097 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.967648 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.968150 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:20 crc kubenswrapper[4772]: I0127 16:39:20.980069 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a20c-account-create-update-7pf8l"] Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045286 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-267vz\" (UniqueName: \"kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045551 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzxl7\" (UniqueName: \"kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7\") pod \"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045573 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7kb9n\" (UniqueName: \"kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045654 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045674 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts\") pod \"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.045734 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.046541 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 
27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.046606 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts\") pod \"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.066778 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kb9n\" (UniqueName: \"kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n\") pod \"nova-api-e098-account-create-update-s8qnw\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.067194 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzxl7\" (UniqueName: \"kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7\") pod \"nova-cell1-db-create-fxhzl\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.079887 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.106158 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.148148 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-267vz\" (UniqueName: \"kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.148315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.149280 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.176662 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-267vz\" (UniqueName: \"kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz\") pod \"nova-cell0-a20c-account-create-update-7pf8l\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.176667 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a831-account-create-update-bq6jl"] Jan 27 16:39:21 crc kubenswrapper[4772]: 
I0127 16:39:21.178145 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.181583 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.185916 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a831-account-create-update-bq6jl"] Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.352729 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.352773 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rddx\" (UniqueName: \"kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.370641 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.386494 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-5g8c9"] Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.457822 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.458199 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rddx\" (UniqueName: \"kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.458512 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.485565 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rddx\" (UniqueName: \"kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx\") pod \"nova-cell1-a831-account-create-update-bq6jl\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 
16:39:21.491422 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wmvjd"] Jan 27 16:39:21 crc kubenswrapper[4772]: W0127 16:39:21.500615 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81b3d773_720e_42c5_af9e_abddc2180ac7.slice/crio-74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189 WatchSource:0}: Error finding container 74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189: Status 404 returned error can't find the container with id 74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189 Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.503150 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.663369 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e098-account-create-update-s8qnw"] Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.713030 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-fxhzl"] Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.817330 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a20c-account-create-update-7pf8l"] Jan 27 16:39:21 crc kubenswrapper[4772]: W0127 16:39:21.818106 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86f6e427_99ce_4873_bacc_697edca3d34e.slice/crio-84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d WatchSource:0}: Error finding container 84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d: Status 404 returned error can't find the container with id 84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.819791 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wmvjd" event={"ID":"81b3d773-720e-42c5-af9e-abddc2180ac7","Type":"ContainerStarted","Data":"92b70acb37a2142424102ba84a18ba6908a56b707c2a540be2311cd899ea872a"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.819831 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wmvjd" event={"ID":"81b3d773-720e-42c5-af9e-abddc2180ac7","Type":"ContainerStarted","Data":"74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.824632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5g8c9" event={"ID":"952d9a1e-efbf-4617-94af-b5ad42cce494","Type":"ContainerStarted","Data":"cba54a4b30bd539d4b80fc206c2394f0309c6182e722300037332e6e39b8dedb"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.824683 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5g8c9" event={"ID":"952d9a1e-efbf-4617-94af-b5ad42cce494","Type":"ContainerStarted","Data":"681ac82fdd84aa4e5d105dd78003928a30e49c90e7fd8b4edd049cfc6b685eff"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.826896 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e098-account-create-update-s8qnw" event={"ID":"e808813a-e588-4fb9-a15d-588d94a4cd59","Type":"ContainerStarted","Data":"2a8731d36142a485e01f80917d9b4cd3eee05d2c848cd6dc5346e696b8c548cb"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.834643 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxhzl" event={"ID":"84c86af5-fd1f-4c53-8978-2b436db59b2a","Type":"ContainerStarted","Data":"9e71adf9731dfffc02261215475cf37fa6421ef0d218583ccfa458e659a17ae9"} Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.844150 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-wmvjd" 
podStartSLOduration=1.844123566 podStartE2EDuration="1.844123566s" podCreationTimestamp="2026-01-27 16:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:21.833981367 +0000 UTC m=+5547.814590475" watchObservedRunningTime="2026-01-27 16:39:21.844123566 +0000 UTC m=+5547.824732674" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.852998 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-5g8c9" podStartSLOduration=1.852975878 podStartE2EDuration="1.852975878s" podCreationTimestamp="2026-01-27 16:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:21.84811532 +0000 UTC m=+5547.828724428" watchObservedRunningTime="2026-01-27 16:39:21.852975878 +0000 UTC m=+5547.833584976" Jan 27 16:39:21 crc kubenswrapper[4772]: I0127 16:39:21.960097 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a831-account-create-update-bq6jl"] Jan 27 16:39:22 crc kubenswrapper[4772]: W0127 16:39:22.035667 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4545581d_5f56_406d_938f_c3b073fdcbce.slice/crio-e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598 WatchSource:0}: Error finding container e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598: Status 404 returned error can't find the container with id e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.854812 4772 generic.go:334] "Generic (PLEG): container finished" podID="81b3d773-720e-42c5-af9e-abddc2180ac7" containerID="92b70acb37a2142424102ba84a18ba6908a56b707c2a540be2311cd899ea872a" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.855198 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wmvjd" event={"ID":"81b3d773-720e-42c5-af9e-abddc2180ac7","Type":"ContainerDied","Data":"92b70acb37a2142424102ba84a18ba6908a56b707c2a540be2311cd899ea872a"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.857803 4772 generic.go:334] "Generic (PLEG): container finished" podID="86f6e427-99ce-4873-bacc-697edca3d34e" containerID="83a6d65ba439c93f6ff663d0a68308cf85e00e3f86836413a3077a6bf72f351a" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.857866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" event={"ID":"86f6e427-99ce-4873-bacc-697edca3d34e","Type":"ContainerDied","Data":"83a6d65ba439c93f6ff663d0a68308cf85e00e3f86836413a3077a6bf72f351a"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.857885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" event={"ID":"86f6e427-99ce-4873-bacc-697edca3d34e","Type":"ContainerStarted","Data":"84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.859658 4772 generic.go:334] "Generic (PLEG): container finished" podID="4545581d-5f56-406d-938f-c3b073fdcbce" containerID="0f3d5e3b05300094485e382bc00ab51b2c741fade4c1a775474788ecee8633d6" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.859697 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" event={"ID":"4545581d-5f56-406d-938f-c3b073fdcbce","Type":"ContainerDied","Data":"0f3d5e3b05300094485e382bc00ab51b2c741fade4c1a775474788ecee8633d6"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.859725 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" 
event={"ID":"4545581d-5f56-406d-938f-c3b073fdcbce","Type":"ContainerStarted","Data":"e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.861293 4772 generic.go:334] "Generic (PLEG): container finished" podID="952d9a1e-efbf-4617-94af-b5ad42cce494" containerID="cba54a4b30bd539d4b80fc206c2394f0309c6182e722300037332e6e39b8dedb" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.861358 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5g8c9" event={"ID":"952d9a1e-efbf-4617-94af-b5ad42cce494","Type":"ContainerDied","Data":"cba54a4b30bd539d4b80fc206c2394f0309c6182e722300037332e6e39b8dedb"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.863053 4772 generic.go:334] "Generic (PLEG): container finished" podID="e808813a-e588-4fb9-a15d-588d94a4cd59" containerID="b2dc1b1a95dcd405b12ae74c124c7458f4aa13c48525ad20496373458b83b670" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.863148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e098-account-create-update-s8qnw" event={"ID":"e808813a-e588-4fb9-a15d-588d94a4cd59","Type":"ContainerDied","Data":"b2dc1b1a95dcd405b12ae74c124c7458f4aa13c48525ad20496373458b83b670"} Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.869059 4772 generic.go:334] "Generic (PLEG): container finished" podID="84c86af5-fd1f-4c53-8978-2b436db59b2a" containerID="f05adee94e87980f08389d2716dd0d5a92a148aff23a1b26c057f06fd19c6f9b" exitCode=0 Jan 27 16:39:22 crc kubenswrapper[4772]: I0127 16:39:22.869105 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxhzl" event={"ID":"84c86af5-fd1f-4c53-8978-2b436db59b2a","Type":"ContainerDied","Data":"f05adee94e87980f08389d2716dd0d5a92a148aff23a1b26c057f06fd19c6f9b"} Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.229267 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e098-account-create-update-s8qnw" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.423734 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts\") pod \"e808813a-e588-4fb9-a15d-588d94a4cd59\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.423912 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kb9n\" (UniqueName: \"kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n\") pod \"e808813a-e588-4fb9-a15d-588d94a4cd59\" (UID: \"e808813a-e588-4fb9-a15d-588d94a4cd59\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.424268 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e808813a-e588-4fb9-a15d-588d94a4cd59" (UID: "e808813a-e588-4fb9-a15d-588d94a4cd59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.424538 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e808813a-e588-4fb9-a15d-588d94a4cd59-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.425993 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-5g8c9" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.429464 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n" (OuterVolumeSpecName: "kube-api-access-7kb9n") pod "e808813a-e588-4fb9-a15d-588d94a4cd59" (UID: "e808813a-e588-4fb9-a15d-588d94a4cd59"). InnerVolumeSpecName "kube-api-access-7kb9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.430503 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.444758 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wmvjd" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.448603 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.462051 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-fxhzl" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.525930 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kb9n\" (UniqueName: \"kubernetes.io/projected/e808813a-e588-4fb9-a15d-588d94a4cd59-kube-api-access-7kb9n\") on node \"crc\" DevicePath \"\"" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.626974 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-267vz\" (UniqueName: \"kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz\") pod \"86f6e427-99ce-4873-bacc-697edca3d34e\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627040 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts\") pod \"84c86af5-fd1f-4c53-8978-2b436db59b2a\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627068 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzxl7\" (UniqueName: \"kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7\") pod \"84c86af5-fd1f-4c53-8978-2b436db59b2a\" (UID: \"84c86af5-fd1f-4c53-8978-2b436db59b2a\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627127 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgrrr\" (UniqueName: \"kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr\") pod \"81b3d773-720e-42c5-af9e-abddc2180ac7\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627186 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfb8z\" (UniqueName: 
\"kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z\") pod \"952d9a1e-efbf-4617-94af-b5ad42cce494\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627211 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts\") pod \"81b3d773-720e-42c5-af9e-abddc2180ac7\" (UID: \"81b3d773-720e-42c5-af9e-abddc2180ac7\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627244 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts\") pod \"4545581d-5f56-406d-938f-c3b073fdcbce\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627263 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts\") pod \"952d9a1e-efbf-4617-94af-b5ad42cce494\" (UID: \"952d9a1e-efbf-4617-94af-b5ad42cce494\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627323 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rddx\" (UniqueName: \"kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx\") pod \"4545581d-5f56-406d-938f-c3b073fdcbce\" (UID: \"4545581d-5f56-406d-938f-c3b073fdcbce\") " Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.627358 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts\") pod \"86f6e427-99ce-4873-bacc-697edca3d34e\" (UID: \"86f6e427-99ce-4873-bacc-697edca3d34e\") " Jan 27 16:39:24 
crc kubenswrapper[4772]: I0127 16:39:24.627914 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84c86af5-fd1f-4c53-8978-2b436db59b2a" (UID: "84c86af5-fd1f-4c53-8978-2b436db59b2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.628012 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86f6e427-99ce-4873-bacc-697edca3d34e" (UID: "86f6e427-99ce-4873-bacc-697edca3d34e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.628083 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4545581d-5f56-406d-938f-c3b073fdcbce" (UID: "4545581d-5f56-406d-938f-c3b073fdcbce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.628327 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "952d9a1e-efbf-4617-94af-b5ad42cce494" (UID: "952d9a1e-efbf-4617-94af-b5ad42cce494"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.628493 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81b3d773-720e-42c5-af9e-abddc2180ac7" (UID: "81b3d773-720e-42c5-af9e-abddc2180ac7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.630975 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7" (OuterVolumeSpecName: "kube-api-access-pzxl7") pod "84c86af5-fd1f-4c53-8978-2b436db59b2a" (UID: "84c86af5-fd1f-4c53-8978-2b436db59b2a"). InnerVolumeSpecName "kube-api-access-pzxl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.631343 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz" (OuterVolumeSpecName: "kube-api-access-267vz") pod "86f6e427-99ce-4873-bacc-697edca3d34e" (UID: "86f6e427-99ce-4873-bacc-697edca3d34e"). InnerVolumeSpecName "kube-api-access-267vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.631409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z" (OuterVolumeSpecName: "kube-api-access-bfb8z") pod "952d9a1e-efbf-4617-94af-b5ad42cce494" (UID: "952d9a1e-efbf-4617-94af-b5ad42cce494"). InnerVolumeSpecName "kube-api-access-bfb8z". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.631913 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr" (OuterVolumeSpecName: "kube-api-access-bgrrr") pod "81b3d773-720e-42c5-af9e-abddc2180ac7" (UID: "81b3d773-720e-42c5-af9e-abddc2180ac7"). InnerVolumeSpecName "kube-api-access-bgrrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.633262 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx" (OuterVolumeSpecName: "kube-api-access-2rddx") pod "4545581d-5f56-406d-938f-c3b073fdcbce" (UID: "4545581d-5f56-406d-938f-c3b073fdcbce"). InnerVolumeSpecName "kube-api-access-2rddx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729344 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rddx\" (UniqueName: \"kubernetes.io/projected/4545581d-5f56-406d-938f-c3b073fdcbce-kube-api-access-2rddx\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729436 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86f6e427-99ce-4873-bacc-697edca3d34e-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729460 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-267vz\" (UniqueName: \"kubernetes.io/projected/86f6e427-99ce-4873-bacc-697edca3d34e-kube-api-access-267vz\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729478 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84c86af5-fd1f-4c53-8978-2b436db59b2a-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729488 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzxl7\" (UniqueName: \"kubernetes.io/projected/84c86af5-fd1f-4c53-8978-2b436db59b2a-kube-api-access-pzxl7\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729497 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgrrr\" (UniqueName: \"kubernetes.io/projected/81b3d773-720e-42c5-af9e-abddc2180ac7-kube-api-access-bgrrr\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729507 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfb8z\" (UniqueName: \"kubernetes.io/projected/952d9a1e-efbf-4617-94af-b5ad42cce494-kube-api-access-bfb8z\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729515 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b3d773-720e-42c5-af9e-abddc2180ac7-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729524 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4545581d-5f56-406d-938f-c3b073fdcbce-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.729532 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/952d9a1e-efbf-4617-94af-b5ad42cce494-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.887427 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a831-account-create-update-bq6jl"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.887421 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a831-account-create-update-bq6jl" event={"ID":"4545581d-5f56-406d-938f-c3b073fdcbce","Type":"ContainerDied","Data":"e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.887998 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e185de8351d3e108267a03aa84e999cda5068ec389181389ffbe49bdab27c598"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.888991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-5g8c9" event={"ID":"952d9a1e-efbf-4617-94af-b5ad42cce494","Type":"ContainerDied","Data":"681ac82fdd84aa4e5d105dd78003928a30e49c90e7fd8b4edd049cfc6b685eff"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.889019 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="681ac82fdd84aa4e5d105dd78003928a30e49c90e7fd8b4edd049cfc6b685eff"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.889068 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-5g8c9"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.891568 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e098-account-create-update-s8qnw" event={"ID":"e808813a-e588-4fb9-a15d-588d94a4cd59","Type":"ContainerDied","Data":"2a8731d36142a485e01f80917d9b4cd3eee05d2c848cd6dc5346e696b8c548cb"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.891590 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8731d36142a485e01f80917d9b4cd3eee05d2c848cd6dc5346e696b8c548cb"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.891627 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e098-account-create-update-s8qnw"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.894513 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-fxhzl" event={"ID":"84c86af5-fd1f-4c53-8978-2b436db59b2a","Type":"ContainerDied","Data":"9e71adf9731dfffc02261215475cf37fa6421ef0d218583ccfa458e659a17ae9"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.894553 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e71adf9731dfffc02261215475cf37fa6421ef0d218583ccfa458e659a17ae9"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.894579 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-fxhzl"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.896513 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wmvjd" event={"ID":"81b3d773-720e-42c5-af9e-abddc2180ac7","Type":"ContainerDied","Data":"74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.896535 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wmvjd"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.896549 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74aafe8a75a36a2526a4ba6482968b03120e31e15b066c2c8da6074effe77189"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.899370 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l" event={"ID":"86f6e427-99ce-4873-bacc-697edca3d34e","Type":"ContainerDied","Data":"84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d"}
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.899391 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84670f9b7280d1f9e799307995f1eb490a345f26043746f3bece904916c6717d"
Jan 27 16:39:24 crc kubenswrapper[4772]: I0127 16:39:24.899512 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a20c-account-create-update-7pf8l"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141066 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k56t7"]
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141822 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e808813a-e588-4fb9-a15d-588d94a4cd59" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141838 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e808813a-e588-4fb9-a15d-588d94a4cd59" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141874 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f6e427-99ce-4873-bacc-697edca3d34e" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141882 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f6e427-99ce-4873-bacc-697edca3d34e" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141894 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952d9a1e-efbf-4617-94af-b5ad42cce494" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141901 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="952d9a1e-efbf-4617-94af-b5ad42cce494" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141912 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4545581d-5f56-406d-938f-c3b073fdcbce" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141919 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4545581d-5f56-406d-938f-c3b073fdcbce" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141933 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c86af5-fd1f-4c53-8978-2b436db59b2a" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141941 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c86af5-fd1f-4c53-8978-2b436db59b2a" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: E0127 16:39:26.141956 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b3d773-720e-42c5-af9e-abddc2180ac7" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.141965 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b3d773-720e-42c5-af9e-abddc2180ac7" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142149 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="952d9a1e-efbf-4617-94af-b5ad42cce494" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142163 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f6e427-99ce-4873-bacc-697edca3d34e" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142195 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4545581d-5f56-406d-938f-c3b073fdcbce" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142220 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b3d773-720e-42c5-af9e-abddc2180ac7" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142254 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c86af5-fd1f-4c53-8978-2b436db59b2a" containerName="mariadb-database-create"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142264 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e808813a-e588-4fb9-a15d-588d94a4cd59" containerName="mariadb-account-create-update"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.142935 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.145224 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6wktq"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.149359 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.155253 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k56t7"]
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.169717 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.254286 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.254362 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.254481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.254567 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbf8m\" (UniqueName: \"kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.356645 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.356733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.356787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbf8m\" (UniqueName: \"kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.356849 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.368568 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.369177 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.383149 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.385294 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbf8m\" (UniqueName: \"kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m\") pod \"nova-cell0-conductor-db-sync-k56t7\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") " pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.484855 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:26 crc kubenswrapper[4772]: I0127 16:39:26.984397 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k56t7"]
Jan 27 16:39:26 crc kubenswrapper[4772]: W0127 16:39:26.984965 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeedd23ad_e532_401a_a991_4bca54fc2711.slice/crio-9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f WatchSource:0}: Error finding container 9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f: Status 404 returned error can't find the container with id 9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f
Jan 27 16:39:27 crc kubenswrapper[4772]: I0127 16:39:27.960358 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k56t7" event={"ID":"eedd23ad-e532-401a-a991-4bca54fc2711","Type":"ContainerStarted","Data":"63117e003669f061743dd5211454454c043cc51173da6608feb5766b651070d3"}
Jan 27 16:39:27 crc kubenswrapper[4772]: I0127 16:39:27.960684 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k56t7" event={"ID":"eedd23ad-e532-401a-a991-4bca54fc2711","Type":"ContainerStarted","Data":"9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f"}
Jan 27 16:39:37 crc kubenswrapper[4772]: I0127 16:39:37.036903 4772 generic.go:334] "Generic (PLEG): container finished" podID="eedd23ad-e532-401a-a991-4bca54fc2711" containerID="63117e003669f061743dd5211454454c043cc51173da6608feb5766b651070d3" exitCode=0
Jan 27 16:39:37 crc kubenswrapper[4772]: I0127 16:39:37.037031 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k56t7" event={"ID":"eedd23ad-e532-401a-a991-4bca54fc2711","Type":"ContainerDied","Data":"63117e003669f061743dd5211454454c043cc51173da6608feb5766b651070d3"}
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.369363 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.497533 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data\") pod \"eedd23ad-e532-401a-a991-4bca54fc2711\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") "
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.497665 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts\") pod \"eedd23ad-e532-401a-a991-4bca54fc2711\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") "
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.497785 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle\") pod \"eedd23ad-e532-401a-a991-4bca54fc2711\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") "
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.497846 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbf8m\" (UniqueName: \"kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m\") pod \"eedd23ad-e532-401a-a991-4bca54fc2711\" (UID: \"eedd23ad-e532-401a-a991-4bca54fc2711\") "
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.503736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts" (OuterVolumeSpecName: "scripts") pod "eedd23ad-e532-401a-a991-4bca54fc2711" (UID: "eedd23ad-e532-401a-a991-4bca54fc2711"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.503921 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m" (OuterVolumeSpecName: "kube-api-access-hbf8m") pod "eedd23ad-e532-401a-a991-4bca54fc2711" (UID: "eedd23ad-e532-401a-a991-4bca54fc2711"). InnerVolumeSpecName "kube-api-access-hbf8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.523063 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eedd23ad-e532-401a-a991-4bca54fc2711" (UID: "eedd23ad-e532-401a-a991-4bca54fc2711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.524487 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data" (OuterVolumeSpecName: "config-data") pod "eedd23ad-e532-401a-a991-4bca54fc2711" (UID: "eedd23ad-e532-401a-a991-4bca54fc2711"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.600494 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.600731 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbf8m\" (UniqueName: \"kubernetes.io/projected/eedd23ad-e532-401a-a991-4bca54fc2711-kube-api-access-hbf8m\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.600791 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:38 crc kubenswrapper[4772]: I0127 16:39:38.600857 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eedd23ad-e532-401a-a991-4bca54fc2711-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.058506 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k56t7" event={"ID":"eedd23ad-e532-401a-a991-4bca54fc2711","Type":"ContainerDied","Data":"9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f"}
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.058546 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a78b7affa413b3705f382390b02b775a4a3f330d40d06789b404d757989470f"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.058592 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k56t7"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.156809 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 16:39:39 crc kubenswrapper[4772]: E0127 16:39:39.157153 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eedd23ad-e532-401a-a991-4bca54fc2711" containerName="nova-cell0-conductor-db-sync"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.157185 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eedd23ad-e532-401a-a991-4bca54fc2711" containerName="nova-cell0-conductor-db-sync"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.157324 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eedd23ad-e532-401a-a991-4bca54fc2711" containerName="nova-cell0-conductor-db-sync"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.157922 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.159971 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6wktq"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.161552 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.179897 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.313011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7l8\" (UniqueName: \"kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.313145 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.313397 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.414913 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.415007 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.415099 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7l8\" (UniqueName: \"kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.419868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.422665 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.441984 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7l8\" (UniqueName: \"kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8\") pod \"nova-cell0-conductor-0\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:39 crc kubenswrapper[4772]: I0127 16:39:39.512397 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:40 crc kubenswrapper[4772]: I0127 16:39:40.003674 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 16:39:40 crc kubenswrapper[4772]: I0127 16:39:40.074006 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e","Type":"ContainerStarted","Data":"30ded49d3176ba704104aa049f568d3100eaac27bd25914a3c0f0f06ce633e73"}
Jan 27 16:39:41 crc kubenswrapper[4772]: I0127 16:39:41.084501 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e","Type":"ContainerStarted","Data":"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50"}
Jan 27 16:39:41 crc kubenswrapper[4772]: I0127 16:39:41.085015 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:41 crc kubenswrapper[4772]: I0127 16:39:41.099349 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.099332574 podStartE2EDuration="2.099332574s" podCreationTimestamp="2026-01-27 16:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:41.097772759 +0000 UTC m=+5567.078381867" watchObservedRunningTime="2026-01-27 16:39:41.099332574 +0000 UTC m=+5567.079941672"
Jan 27 16:39:49 crc kubenswrapper[4772]: I0127 16:39:49.562904 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.011711 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-9vmft"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.013269 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.015668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.023216 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.023788 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9vmft"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.114898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.114999 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.115101 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjnmp\" (UniqueName: \"kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.115131 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.164704 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.165795 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.168315 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.180803 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.217606 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjnmp\" (UniqueName: \"kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.217652 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.217713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.217769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.236968 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.237689 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.238607 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.267864 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.270120 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.280755 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.287798 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjnmp\" (UniqueName: \"kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp\") pod \"nova-cell0-cell-mapping-9vmft\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") " pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.318815 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.318903 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqgtr\" (UniqueName: \"kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.318964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0"
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.325240 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.341601
4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9vmft" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.375649 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.377148 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.382010 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.409224 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426205 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426409 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp727\" (UniqueName: \"kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426449 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426474 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqgtr\" (UniqueName: \"kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.426577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.435930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.441995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data\") pod 
\"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.510399 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqgtr\" (UniqueName: \"kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr\") pod \"nova-scheduler-0\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.530277 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.530679 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.530810 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.530928 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.531065 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfs5m\" (UniqueName: \"kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.531210 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.531361 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.531522 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp727\" (UniqueName: \"kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.534785 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.541153 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.541743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.542060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.569502 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.571273 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.604048 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp727\" (UniqueName: \"kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727\") pod \"nova-metadata-0\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.625082 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633340 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633413 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfs5m\" (UniqueName: \"kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633444 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633476 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: 
\"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633509 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv5xf\" (UniqueName: \"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633612 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.633633 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: 
\"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.634017 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.637982 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.640773 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.673972 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfs5m\" (UniqueName: \"kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m\") pod \"nova-api-0\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") " pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.732046 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.733195 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.736255 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv5xf\" (UniqueName: \"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.736322 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.736375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.736401 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.736534 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 
27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.737472 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.737563 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.738426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.738538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.739101 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.773952 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv5xf\" (UniqueName: \"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf\") pod \"dnsmasq-dns-7758766f57-6fk67\" (UID: 
\"f90709ae-7118-42ca-aad1-922961a6f858\") " pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.775792 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.789255 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.805710 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.839949 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22khv\" (UniqueName: \"kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.840490 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.840573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.945874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.946094 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22khv\" (UniqueName: \"kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.946135 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.956630 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.962592 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.977965 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22khv\" (UniqueName: \"kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:50 crc kubenswrapper[4772]: I0127 16:39:50.987033 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.101072 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.352313 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-9vmft"] Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.399824 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:39:51 crc kubenswrapper[4772]: W0127 16:39:51.408141 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d4226f_e574_498f_a2ce_e0db8f83a8d3.slice/crio-1c139ace8ceb66ebe7302c272fa99b828aba2d79a11db30142d7bb5b26746bbd WatchSource:0}: Error finding container 1c139ace8ceb66ebe7302c272fa99b828aba2d79a11db30142d7bb5b26746bbd: Status 404 returned error can't find the container with id 1c139ace8ceb66ebe7302c272fa99b828aba2d79a11db30142d7bb5b26746bbd Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.526177 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.534831 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:39:51 crc kubenswrapper[4772]: W0127 16:39:51.542571 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f993d38_e5bb_4ce4_8a9f_269695614f5e.slice/crio-03b13462263b2fb77b677576417743ca4d4b9b8dce8807867be92ad60cefd791 WatchSource:0}: Error finding container 
03b13462263b2fb77b677576417743ca4d4b9b8dce8807867be92ad60cefd791: Status 404 returned error can't find the container with id 03b13462263b2fb77b677576417743ca4d4b9b8dce8807867be92ad60cefd791
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.666812 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"]
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.699181 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dk47"]
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.700257 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.704000 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.704393 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.712214 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dk47"]
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.767762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zgx\" (UniqueName: \"kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.767853 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.767881 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.767971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.803916 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.870251 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.870725 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zgx\" (UniqueName: \"kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.870800 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.870827 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.887913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.888522 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.888855 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:51 crc kubenswrapper[4772]: I0127 16:39:51.894375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zgx\" (UniqueName: \"kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx\") pod \"nova-cell1-conductor-db-sync-9dk47\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") " pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.028812 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.234816 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9vmft" event={"ID":"f4c79395-a929-4f0d-8aa7-05f24412baed","Type":"ContainerStarted","Data":"cd101790786da89124bf4fdc7b4bace7b135a40af114a9eeb7dbc7a4372fd732"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.234875 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9vmft" event={"ID":"f4c79395-a929-4f0d-8aa7-05f24412baed","Type":"ContainerStarted","Data":"52cbee2f5d4eb8ed937fe3376681b915bcc858435daf4a58153a4c6bb2cb935d"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.256412 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerStarted","Data":"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.256472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerStarted","Data":"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.256486 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerStarted","Data":"44267cacdffa14217dc9859dca23e969af5b7adbb4750d9c4f117ba9ccd3c77f"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.259196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"26d4226f-e574-498f-a2ce-e0db8f83a8d3","Type":"ContainerStarted","Data":"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.259226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"26d4226f-e574-498f-a2ce-e0db8f83a8d3","Type":"ContainerStarted","Data":"1c139ace8ceb66ebe7302c272fa99b828aba2d79a11db30142d7bb5b26746bbd"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.270139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a30c12eb-4e83-420a-8064-859689d91d2d","Type":"ContainerStarted","Data":"fba71bd5b1f04f4ce21ecca46c2028b0aeffaa1f208e5b17978c2d0f906cbf36"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.270201 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a30c12eb-4e83-420a-8064-859689d91d2d","Type":"ContainerStarted","Data":"772c077f79e9fa947b145341393fdb1b1ddc0a4d4b6ec55b124c53c8a8bb5534"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.281375 4772 generic.go:334] "Generic (PLEG): container finished" podID="f90709ae-7118-42ca-aad1-922961a6f858" containerID="ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51" exitCode=0
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.281766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7758766f57-6fk67" event={"ID":"f90709ae-7118-42ca-aad1-922961a6f858","Type":"ContainerDied","Data":"ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.281795 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7758766f57-6fk67" event={"ID":"f90709ae-7118-42ca-aad1-922961a6f858","Type":"ContainerStarted","Data":"b319140f655aa1b23b1e685173361086b6144080f0c58e842877b6cfd23b2922"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.305394 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerStarted","Data":"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.305635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerStarted","Data":"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.305646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerStarted","Data":"03b13462263b2fb77b677576417743ca4d4b9b8dce8807867be92ad60cefd791"}
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.316629 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-9vmft" podStartSLOduration=3.316433564 podStartE2EDuration="3.316433564s" podCreationTimestamp="2026-01-27 16:39:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:52.282692362 +0000 UTC m=+5578.263301460" watchObservedRunningTime="2026-01-27 16:39:52.316433564 +0000 UTC m=+5578.297042662"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.356442 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.356422594 podStartE2EDuration="2.356422594s" podCreationTimestamp="2026-01-27 16:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:52.330021471 +0000 UTC m=+5578.310630569" watchObservedRunningTime="2026-01-27 16:39:52.356422594 +0000 UTC m=+5578.337031692"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.359914 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.359903673 podStartE2EDuration="2.359903673s" podCreationTimestamp="2026-01-27 16:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:52.352325017 +0000 UTC m=+5578.332934125" watchObservedRunningTime="2026-01-27 16:39:52.359903673 +0000 UTC m=+5578.340512771"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.417696 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.41767385 podStartE2EDuration="2.41767385s" podCreationTimestamp="2026-01-27 16:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:52.395210519 +0000 UTC m=+5578.375819627" watchObservedRunningTime="2026-01-27 16:39:52.41767385 +0000 UTC m=+5578.398282938"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.426564 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.426540122 podStartE2EDuration="2.426540122s" podCreationTimestamp="2026-01-27 16:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:52.41278644 +0000 UTC m=+5578.393395548" watchObservedRunningTime="2026-01-27 16:39:52.426540122 +0000 UTC m=+5578.407149230"
Jan 27 16:39:52 crc kubenswrapper[4772]: I0127 16:39:52.582783 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dk47"]
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.317281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7758766f57-6fk67" event={"ID":"f90709ae-7118-42ca-aad1-922961a6f858","Type":"ContainerStarted","Data":"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5"}
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.317749 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7758766f57-6fk67"
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.319107 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dk47" event={"ID":"c73df4be-e448-4930-ae5e-d74fde1b4b6d","Type":"ContainerStarted","Data":"4799caa746e9ac611c5360c3f4fb6d88e8100346e94f7137cac433d747b98815"}
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.319156 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dk47" event={"ID":"c73df4be-e448-4930-ae5e-d74fde1b4b6d","Type":"ContainerStarted","Data":"916e7f8ec2b785139b190369d0dd78267c4fd066758c7d8bf358fac5d60afeb5"}
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.355871 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9dk47" podStartSLOduration=2.355854162 podStartE2EDuration="2.355854162s" podCreationTimestamp="2026-01-27 16:39:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:53.354893244 +0000 UTC m=+5579.335502342" watchObservedRunningTime="2026-01-27 16:39:53.355854162 +0000 UTC m=+5579.336463260"
Jan 27 16:39:53 crc kubenswrapper[4772]: I0127 16:39:53.359658 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7758766f57-6fk67" podStartSLOduration=3.35964491 podStartE2EDuration="3.35964491s" podCreationTimestamp="2026-01-27 16:39:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:39:53.34143194 +0000 UTC m=+5579.322041038" watchObservedRunningTime="2026-01-27 16:39:53.35964491 +0000 UTC m=+5579.340254008"
Jan 27 16:39:55 crc kubenswrapper[4772]: I0127 16:39:55.541273 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 27 16:39:55 crc kubenswrapper[4772]: I0127 16:39:55.789748 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 27 16:39:55 crc kubenswrapper[4772]: I0127 16:39:55.789820 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 27 16:39:56 crc kubenswrapper[4772]: I0127 16:39:56.102756 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Jan 27 16:39:56 crc kubenswrapper[4772]: I0127 16:39:56.342743 4772 generic.go:334] "Generic (PLEG): container finished" podID="c73df4be-e448-4930-ae5e-d74fde1b4b6d" containerID="4799caa746e9ac611c5360c3f4fb6d88e8100346e94f7137cac433d747b98815" exitCode=0
Jan 27 16:39:56 crc kubenswrapper[4772]: I0127 16:39:56.342788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dk47" event={"ID":"c73df4be-e448-4930-ae5e-d74fde1b4b6d","Type":"ContainerDied","Data":"4799caa746e9ac611c5360c3f4fb6d88e8100346e94f7137cac433d747b98815"}
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.354542 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4c79395-a929-4f0d-8aa7-05f24412baed" containerID="cd101790786da89124bf4fdc7b4bace7b135a40af114a9eeb7dbc7a4372fd732" exitCode=0
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.354605 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9vmft" event={"ID":"f4c79395-a929-4f0d-8aa7-05f24412baed","Type":"ContainerDied","Data":"cd101790786da89124bf4fdc7b4bace7b135a40af114a9eeb7dbc7a4372fd732"}
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.790246 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.904460 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts\") pod \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") "
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.904632 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data\") pod \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") "
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.904666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle\") pod \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") "
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.904713 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zgx\" (UniqueName: \"kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx\") pod \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\" (UID: \"c73df4be-e448-4930-ae5e-d74fde1b4b6d\") "
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.911615 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx" (OuterVolumeSpecName: "kube-api-access-q7zgx") pod "c73df4be-e448-4930-ae5e-d74fde1b4b6d" (UID: "c73df4be-e448-4930-ae5e-d74fde1b4b6d"). InnerVolumeSpecName "kube-api-access-q7zgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.911926 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts" (OuterVolumeSpecName: "scripts") pod "c73df4be-e448-4930-ae5e-d74fde1b4b6d" (UID: "c73df4be-e448-4930-ae5e-d74fde1b4b6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.932879 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c73df4be-e448-4930-ae5e-d74fde1b4b6d" (UID: "c73df4be-e448-4930-ae5e-d74fde1b4b6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:57 crc kubenswrapper[4772]: I0127 16:39:57.950592 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data" (OuterVolumeSpecName: "config-data") pod "c73df4be-e448-4930-ae5e-d74fde1b4b6d" (UID: "c73df4be-e448-4930-ae5e-d74fde1b4b6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.007028 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.007060 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.007073 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c73df4be-e448-4930-ae5e-d74fde1b4b6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.007083 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zgx\" (UniqueName: \"kubernetes.io/projected/c73df4be-e448-4930-ae5e-d74fde1b4b6d-kube-api-access-q7zgx\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.365485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9dk47" event={"ID":"c73df4be-e448-4930-ae5e-d74fde1b4b6d","Type":"ContainerDied","Data":"916e7f8ec2b785139b190369d0dd78267c4fd066758c7d8bf358fac5d60afeb5"}
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.365529 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916e7f8ec2b785139b190369d0dd78267c4fd066758c7d8bf358fac5d60afeb5"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.365582 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9dk47"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.442100 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 16:39:58 crc kubenswrapper[4772]: E0127 16:39:58.442500 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73df4be-e448-4930-ae5e-d74fde1b4b6d" containerName="nova-cell1-conductor-db-sync"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.442514 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73df4be-e448-4930-ae5e-d74fde1b4b6d" containerName="nova-cell1-conductor-db-sync"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.442673 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73df4be-e448-4930-ae5e-d74fde1b4b6d" containerName="nova-cell1-conductor-db-sync"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.443226 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.444902 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.454406 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.514983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhmd\" (UniqueName: \"kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.515036 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.515073 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.617222 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhmd\" (UniqueName: \"kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.617283 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.617328 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.620913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.622001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.640398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhmd\" (UniqueName: \"kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd\") pod \"nova-cell1-conductor-0\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.763826 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Jan 27 16:39:58 crc kubenswrapper[4772]: I0127 16:39:58.882450 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.024442 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data\") pod \"f4c79395-a929-4f0d-8aa7-05f24412baed\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") "
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.024534 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts\") pod \"f4c79395-a929-4f0d-8aa7-05f24412baed\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") "
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.024652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjnmp\" (UniqueName: \"kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp\") pod \"f4c79395-a929-4f0d-8aa7-05f24412baed\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") "
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.024714 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle\") pod \"f4c79395-a929-4f0d-8aa7-05f24412baed\" (UID: \"f4c79395-a929-4f0d-8aa7-05f24412baed\") "
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.038051 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts" (OuterVolumeSpecName: "scripts") pod "f4c79395-a929-4f0d-8aa7-05f24412baed" (UID: "f4c79395-a929-4f0d-8aa7-05f24412baed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.038140 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp" (OuterVolumeSpecName: "kube-api-access-mjnmp") pod "f4c79395-a929-4f0d-8aa7-05f24412baed" (UID: "f4c79395-a929-4f0d-8aa7-05f24412baed"). InnerVolumeSpecName "kube-api-access-mjnmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.059562 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4c79395-a929-4f0d-8aa7-05f24412baed" (UID: "f4c79395-a929-4f0d-8aa7-05f24412baed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.067091 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data" (OuterVolumeSpecName: "config-data") pod "f4c79395-a929-4f0d-8aa7-05f24412baed" (UID: "f4c79395-a929-4f0d-8aa7-05f24412baed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.126655 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjnmp\" (UniqueName: \"kubernetes.io/projected/f4c79395-a929-4f0d-8aa7-05f24412baed-kube-api-access-mjnmp\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.126708 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.126722 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.126734 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c79395-a929-4f0d-8aa7-05f24412baed-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.253125 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Jan 27 16:39:59 crc kubenswrapper[4772]: W0127 16:39:59.257432 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1c55305_7d18_44cb_90e5_b6793989abda.slice/crio-8f15691a47c277f39b1b0ae53dfcc7f92ebe7de7d545b4f191aa38d15da7fb88 WatchSource:0}: Error finding container 8f15691a47c277f39b1b0ae53dfcc7f92ebe7de7d545b4f191aa38d15da7fb88: Status 404 returned error can't find the container with id 8f15691a47c277f39b1b0ae53dfcc7f92ebe7de7d545b4f191aa38d15da7fb88
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.389333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-9vmft" event={"ID":"f4c79395-a929-4f0d-8aa7-05f24412baed","Type":"ContainerDied","Data":"52cbee2f5d4eb8ed937fe3376681b915bcc858435daf4a58153a4c6bb2cb935d"}
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.389827 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52cbee2f5d4eb8ed937fe3376681b915bcc858435daf4a58153a4c6bb2cb935d"
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.389946 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-9vmft"
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.392086 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e1c55305-7d18-44cb-90e5-b6793989abda","Type":"ContainerStarted","Data":"8f15691a47c277f39b1b0ae53dfcc7f92ebe7de7d545b4f191aa38d15da7fb88"}
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.566231 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.566672 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-log" containerID="cri-o://0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" gracePeriod=30
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.566887 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-api" containerID="cri-o://d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" gracePeriod=30
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.575876 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.576114 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" containerName="nova-scheduler-scheduler" containerID="cri-o://d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58" gracePeriod=30
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.636080 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.636451 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-metadata" containerID="cri-o://3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" gracePeriod=30
Jan 27 16:39:59 crc kubenswrapper[4772]: I0127 16:39:59.636437 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-log" containerID="cri-o://43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" gracePeriod=30
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.153308 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.183876 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.248664 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs\") pod \"e3e14057-d3be-433e-97a7-0b36f18382ee\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") "
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.248779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfs5m\" (UniqueName: \"kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m\") pod \"e3e14057-d3be-433e-97a7-0b36f18382ee\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") "
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.248853 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle\") pod \"e3e14057-d3be-433e-97a7-0b36f18382ee\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") "
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.248914 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data\") pod \"e3e14057-d3be-433e-97a7-0b36f18382ee\" (UID: \"e3e14057-d3be-433e-97a7-0b36f18382ee\") "
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.249192 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs" (OuterVolumeSpecName: "logs") pod "e3e14057-d3be-433e-97a7-0b36f18382ee" (UID: "e3e14057-d3be-433e-97a7-0b36f18382ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.249471 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3e14057-d3be-433e-97a7-0b36f18382ee-logs\") on node \"crc\" DevicePath \"\""
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.264431 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m" (OuterVolumeSpecName: "kube-api-access-mfs5m") pod "e3e14057-d3be-433e-97a7-0b36f18382ee" (UID: "e3e14057-d3be-433e-97a7-0b36f18382ee"). InnerVolumeSpecName "kube-api-access-mfs5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.273717 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3e14057-d3be-433e-97a7-0b36f18382ee" (UID: "e3e14057-d3be-433e-97a7-0b36f18382ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.274138 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data" (OuterVolumeSpecName: "config-data") pod "e3e14057-d3be-433e-97a7-0b36f18382ee" (UID: "e3e14057-d3be-433e-97a7-0b36f18382ee"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.350412 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data\") pod \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.350540 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs\") pod \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.350580 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle\") pod \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.350701 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp727\" (UniqueName: \"kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727\") pod \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\" (UID: \"9f993d38-e5bb-4ce4-8a9f-269695614f5e\") " Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.351031 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs" (OuterVolumeSpecName: "logs") pod "9f993d38-e5bb-4ce4-8a9f-269695614f5e" (UID: "9f993d38-e5bb-4ce4-8a9f-269695614f5e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.351291 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfs5m\" (UniqueName: \"kubernetes.io/projected/e3e14057-d3be-433e-97a7-0b36f18382ee-kube-api-access-mfs5m\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.351318 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f993d38-e5bb-4ce4-8a9f-269695614f5e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.351336 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.351348 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e14057-d3be-433e-97a7-0b36f18382ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.353650 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727" (OuterVolumeSpecName: "kube-api-access-bp727") pod "9f993d38-e5bb-4ce4-8a9f-269695614f5e" (UID: "9f993d38-e5bb-4ce4-8a9f-269695614f5e"). InnerVolumeSpecName "kube-api-access-bp727". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.372201 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data" (OuterVolumeSpecName: "config-data") pod "9f993d38-e5bb-4ce4-8a9f-269695614f5e" (UID: "9f993d38-e5bb-4ce4-8a9f-269695614f5e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.374290 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f993d38-e5bb-4ce4-8a9f-269695614f5e" (UID: "9f993d38-e5bb-4ce4-8a9f-269695614f5e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403125 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerID="3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" exitCode=0 Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403151 4772 generic.go:334] "Generic (PLEG): container finished" podID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerID="43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" exitCode=143 Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerDied","Data":"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerDied","Data":"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9f993d38-e5bb-4ce4-8a9f-269695614f5e","Type":"ContainerDied","Data":"03b13462263b2fb77b677576417743ca4d4b9b8dce8807867be92ad60cefd791"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403280 4772 scope.go:117] 
"RemoveContainer" containerID="3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.403386 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414450 4772 generic.go:334] "Generic (PLEG): container finished" podID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerID="d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" exitCode=0 Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414478 4772 generic.go:334] "Generic (PLEG): container finished" podID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerID="0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" exitCode=143 Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414533 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerDied","Data":"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414584 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerDied","Data":"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414599 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3e14057-d3be-433e-97a7-0b36f18382ee","Type":"ContainerDied","Data":"44267cacdffa14217dc9859dca23e969af5b7adbb4750d9c4f117ba9ccd3c77f"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.414552 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.416240 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e1c55305-7d18-44cb-90e5-b6793989abda","Type":"ContainerStarted","Data":"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81"} Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.416734 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.453832 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.453874 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp727\" (UniqueName: \"kubernetes.io/projected/9f993d38-e5bb-4ce4-8a9f-269695614f5e-kube-api-access-bp727\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.453887 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f993d38-e5bb-4ce4-8a9f-269695614f5e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.466746 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.466723809 podStartE2EDuration="2.466723809s" podCreationTimestamp="2026-01-27 16:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:00.457801254 +0000 UTC m=+5586.438410362" watchObservedRunningTime="2026-01-27 16:40:00.466723809 +0000 UTC m=+5586.447332907" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.631558 4772 scope.go:117] 
"RemoveContainer" containerID="43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.632335 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.646295 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.663271 4772 scope.go:117] "RemoveContainer" containerID="3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.663751 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb\": container with ID starting with 3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb not found: ID does not exist" containerID="3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.663783 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb"} err="failed to get container status \"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb\": rpc error: code = NotFound desc = could not find container \"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb\": container with ID starting with 3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.663802 4772 scope.go:117] "RemoveContainer" containerID="43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.666070 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a\": container with ID starting with 43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a not found: ID does not exist" containerID="43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.666099 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a"} err="failed to get container status \"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a\": rpc error: code = NotFound desc = could not find container \"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a\": container with ID starting with 43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.666114 4772 scope.go:117] "RemoveContainer" containerID="3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.667425 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb"} err="failed to get container status \"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb\": rpc error: code = NotFound desc = could not find container \"3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb\": container with ID starting with 3a53c1dc2aeb5bf544910214491023eeece1cfbdb9d21ada9d6cf8e1d2675ffb not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.667459 4772 scope.go:117] "RemoveContainer" containerID="43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.667722 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a"} err="failed to get container status \"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a\": rpc error: code = NotFound desc = could not find container \"43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a\": container with ID starting with 43c20e864a030473b88bc17f0d3fed51612d4bbdf0ff55280f36b4578127cc8a not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.667740 4772 scope.go:117] "RemoveContainer" containerID="d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.724575 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" path="/var/lib/kubelet/pods/9f993d38-e5bb-4ce4-8a9f-269695614f5e/volumes" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725251 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.725536 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c79395-a929-4f0d-8aa7-05f24412baed" containerName="nova-manage" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725555 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c79395-a929-4f0d-8aa7-05f24412baed" containerName="nova-manage" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.725567 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-metadata" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725573 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-metadata" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.725590 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-log" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725596 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-log" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.725607 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-api" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725613 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-api" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.725638 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-log" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725644 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-log" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725803 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-api" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725813 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-log" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725826 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" containerName="nova-api-log" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725832 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f993d38-e5bb-4ce4-8a9f-269695614f5e" containerName="nova-metadata-metadata" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.725845 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4c79395-a929-4f0d-8aa7-05f24412baed" containerName="nova-manage" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.726764 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.726783 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.726796 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.726806 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.727225 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.728233 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.728313 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.729875 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.730002 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.765643 4772 scope.go:117] "RemoveContainer" containerID="0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.792834 4772 scope.go:117] "RemoveContainer" containerID="d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" Jan 27 16:40:00 crc kubenswrapper[4772]: E0127 16:40:00.793281 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473\": container with ID starting with d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473 not found: ID does not exist" containerID="d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.793343 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473"} err="failed to get container status \"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473\": rpc error: code = NotFound desc = could not find container \"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473\": container with ID starting with d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473 not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.793368 4772 scope.go:117] "RemoveContainer" containerID="0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" Jan 27 16:40:00 
crc kubenswrapper[4772]: E0127 16:40:00.793746 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635\": container with ID starting with 0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635 not found: ID does not exist" containerID="0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.793776 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635"} err="failed to get container status \"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635\": rpc error: code = NotFound desc = could not find container \"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635\": container with ID starting with 0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635 not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.793799 4772 scope.go:117] "RemoveContainer" containerID="d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.794027 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473"} err="failed to get container status \"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473\": rpc error: code = NotFound desc = could not find container \"d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473\": container with ID starting with d78776dd87426d6e9e796444fb4d98fcf0b08f9b784ed6f290516aa2ca7cd473 not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.794066 4772 scope.go:117] "RemoveContainer" containerID="0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635" Jan 27 
16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.794290 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635"} err="failed to get container status \"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635\": rpc error: code = NotFound desc = could not find container \"0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635\": container with ID starting with 0271b9dbb1d2941b26f79c6625f660e41862d12903a306e76d443a4c1adc8635 not found: ID does not exist" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.865898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.865985 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r644m\" (UniqueName: \"kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866052 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866094 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866231 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.866267 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rxm\" (UniqueName: \"kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.967936 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 
16:40:00.968012 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968039 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rxm\" (UniqueName: \"kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968090 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968139 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r644m\" (UniqueName: \"kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968199 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968241 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.968884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.969529 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.972138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.973207 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.974214 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.981089 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:00 crc kubenswrapper[4772]: I0127 16:40:00.998833 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rxm\" (UniqueName: \"kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm\") pod \"nova-api-0\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " pod="openstack/nova-api-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.000377 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.008002 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r644m\" (UniqueName: \"kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m\") pod \"nova-metadata-0\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " pod="openstack/nova-metadata-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.061246 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.061482 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-545c956d45-h25qs" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="dnsmasq-dns" containerID="cri-o://54f0c375865810e4cb819d2019e6c8ab7827175c7d633ed00deb75c9600fb4b9" gracePeriod=10 Jan 27 16:40:01 crc 
kubenswrapper[4772]: I0127 16:40:01.082635 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.086123 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.102223 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.131456 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.438887 4772 generic.go:334] "Generic (PLEG): container finished" podID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerID="54f0c375865810e4cb819d2019e6c8ab7827175c7d633ed00deb75c9600fb4b9" exitCode=0 Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.438967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545c956d45-h25qs" event={"ID":"6da65d05-29e3-4d97-869f-d3386a45a38e","Type":"ContainerDied","Data":"54f0c375865810e4cb819d2019e6c8ab7827175c7d633ed00deb75c9600fb4b9"} Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.448672 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.514114 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.680226 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.683187 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb\") pod \"6da65d05-29e3-4d97-869f-d3386a45a38e\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.683244 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config\") pod \"6da65d05-29e3-4d97-869f-d3386a45a38e\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.683308 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb\") pod \"6da65d05-29e3-4d97-869f-d3386a45a38e\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.683368 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh8vx\" (UniqueName: \"kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx\") pod \"6da65d05-29e3-4d97-869f-d3386a45a38e\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.683409 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc\") pod \"6da65d05-29e3-4d97-869f-d3386a45a38e\" (UID: \"6da65d05-29e3-4d97-869f-d3386a45a38e\") " Jan 27 16:40:01 crc 
kubenswrapper[4772]: I0127 16:40:01.689665 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.693765 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx" (OuterVolumeSpecName: "kube-api-access-sh8vx") pod "6da65d05-29e3-4d97-869f-d3386a45a38e" (UID: "6da65d05-29e3-4d97-869f-d3386a45a38e"). InnerVolumeSpecName "kube-api-access-sh8vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.741612 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6da65d05-29e3-4d97-869f-d3386a45a38e" (UID: "6da65d05-29e3-4d97-869f-d3386a45a38e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.748409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6da65d05-29e3-4d97-869f-d3386a45a38e" (UID: "6da65d05-29e3-4d97-869f-d3386a45a38e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.770039 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config" (OuterVolumeSpecName: "config") pod "6da65d05-29e3-4d97-869f-d3386a45a38e" (UID: "6da65d05-29e3-4d97-869f-d3386a45a38e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.785974 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh8vx\" (UniqueName: \"kubernetes.io/projected/6da65d05-29e3-4d97-869f-d3386a45a38e-kube-api-access-sh8vx\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.786230 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.786361 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.786458 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.787580 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6da65d05-29e3-4d97-869f-d3386a45a38e" (UID: "6da65d05-29e3-4d97-869f-d3386a45a38e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:01 crc kubenswrapper[4772]: I0127 16:40:01.889136 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6da65d05-29e3-4d97-869f-d3386a45a38e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.451664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerStarted","Data":"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.452072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerStarted","Data":"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.452097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerStarted","Data":"272b9d5f7574afede07d814583b3f863aa00e225817941ab1c1713433efa66a6"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.455159 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-545c956d45-h25qs" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.455160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-545c956d45-h25qs" event={"ID":"6da65d05-29e3-4d97-869f-d3386a45a38e","Type":"ContainerDied","Data":"1b2cc021d3453413ef029602e151c491355c94bca54c6c90c56cd7685dc93518"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.455397 4772 scope.go:117] "RemoveContainer" containerID="54f0c375865810e4cb819d2019e6c8ab7827175c7d633ed00deb75c9600fb4b9" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.459362 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerStarted","Data":"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.459399 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerStarted","Data":"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.459421 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerStarted","Data":"c71aafaf39ad009fb7eb0ade8a547f4797867d098bc1c0cd4f5e22f563ee2d30"} Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.483954 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.483935716 podStartE2EDuration="2.483935716s" podCreationTimestamp="2026-01-27 16:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:02.472333486 +0000 UTC m=+5588.452942604" watchObservedRunningTime="2026-01-27 16:40:02.483935716 +0000 UTC 
m=+5588.464544814" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.498027 4772 scope.go:117] "RemoveContainer" containerID="b8551d53299337e281772b00e27dddd2da7a39a65b37574b860847ac8a5c90ba" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.503720 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.50370346 podStartE2EDuration="2.50370346s" podCreationTimestamp="2026-01-27 16:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:02.495739433 +0000 UTC m=+5588.476348531" watchObservedRunningTime="2026-01-27 16:40:02.50370346 +0000 UTC m=+5588.484312558" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.526381 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.537830 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-545c956d45-h25qs"] Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.677476 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" path="/var/lib/kubelet/pods/6da65d05-29e3-4d97-869f-d3386a45a38e/volumes" Jan 27 16:40:02 crc kubenswrapper[4772]: I0127 16:40:02.678412 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e14057-d3be-433e-97a7-0b36f18382ee" path="/var/lib/kubelet/pods/e3e14057-d3be-433e-97a7-0b36f18382ee/volumes" Jan 27 16:40:03 crc kubenswrapper[4772]: I0127 16:40:03.871834 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.029381 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data\") pod \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.029441 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle\") pod \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.029488 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqgtr\" (UniqueName: \"kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr\") pod \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\" (UID: \"26d4226f-e574-498f-a2ce-e0db8f83a8d3\") " Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.035338 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr" (OuterVolumeSpecName: "kube-api-access-tqgtr") pod "26d4226f-e574-498f-a2ce-e0db8f83a8d3" (UID: "26d4226f-e574-498f-a2ce-e0db8f83a8d3"). InnerVolumeSpecName "kube-api-access-tqgtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.051854 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data" (OuterVolumeSpecName: "config-data") pod "26d4226f-e574-498f-a2ce-e0db8f83a8d3" (UID: "26d4226f-e574-498f-a2ce-e0db8f83a8d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.053891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26d4226f-e574-498f-a2ce-e0db8f83a8d3" (UID: "26d4226f-e574-498f-a2ce-e0db8f83a8d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.131831 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.131875 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26d4226f-e574-498f-a2ce-e0db8f83a8d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.131894 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqgtr\" (UniqueName: \"kubernetes.io/projected/26d4226f-e574-498f-a2ce-e0db8f83a8d3-kube-api-access-tqgtr\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.478786 4772 generic.go:334] "Generic (PLEG): container finished" podID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" containerID="d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58" exitCode=0 Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.478847 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.478856 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"26d4226f-e574-498f-a2ce-e0db8f83a8d3","Type":"ContainerDied","Data":"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58"} Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.478927 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"26d4226f-e574-498f-a2ce-e0db8f83a8d3","Type":"ContainerDied","Data":"1c139ace8ceb66ebe7302c272fa99b828aba2d79a11db30142d7bb5b26746bbd"} Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.478961 4772 scope.go:117] "RemoveContainer" containerID="d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.499536 4772 scope.go:117] "RemoveContainer" containerID="d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58" Jan 27 16:40:04 crc kubenswrapper[4772]: E0127 16:40:04.500065 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58\": container with ID starting with d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58 not found: ID does not exist" containerID="d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.500113 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58"} err="failed to get container status \"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58\": rpc error: code = NotFound desc = could not find container \"d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58\": container with ID starting with 
d6c1d8ec858944c73c32d29d0613f42bb401204fa7425a43dbda27a9aa01ac58 not found: ID does not exist" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.522344 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.534390 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.546948 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:04 crc kubenswrapper[4772]: E0127 16:40:04.547376 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" containerName="nova-scheduler-scheduler" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.547393 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" containerName="nova-scheduler-scheduler" Jan 27 16:40:04 crc kubenswrapper[4772]: E0127 16:40:04.547413 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="dnsmasq-dns" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.547421 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="dnsmasq-dns" Jan 27 16:40:04 crc kubenswrapper[4772]: E0127 16:40:04.547436 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="init" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.547442 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="init" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.547597 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" containerName="nova-scheduler-scheduler" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 
16:40:04.547610 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da65d05-29e3-4d97-869f-d3386a45a38e" containerName="dnsmasq-dns" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.548261 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.550361 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.570678 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.641622 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.641727 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khtr\" (UniqueName: \"kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.641788 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.684535 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d4226f-e574-498f-a2ce-e0db8f83a8d3" 
path="/var/lib/kubelet/pods/26d4226f-e574-498f-a2ce-e0db8f83a8d3/volumes" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.743945 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khtr\" (UniqueName: \"kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.744120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.744354 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.751753 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.752349 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.761678 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2khtr\" (UniqueName: \"kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr\") pod \"nova-scheduler-0\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:04 crc kubenswrapper[4772]: I0127 16:40:04.877267 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:05 crc kubenswrapper[4772]: I0127 16:40:05.356675 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:05 crc kubenswrapper[4772]: W0127 16:40:05.364918 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02c9bfb0_5f95_4692_bb48_ffd47f6682a8.slice/crio-3c6a3e07c72ed8c1bb7f2e42a623b437deaa88e1d9979eefafaa45fb0ff45499 WatchSource:0}: Error finding container 3c6a3e07c72ed8c1bb7f2e42a623b437deaa88e1d9979eefafaa45fb0ff45499: Status 404 returned error can't find the container with id 3c6a3e07c72ed8c1bb7f2e42a623b437deaa88e1d9979eefafaa45fb0ff45499 Jan 27 16:40:05 crc kubenswrapper[4772]: I0127 16:40:05.491523 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02c9bfb0-5f95-4692-bb48-ffd47f6682a8","Type":"ContainerStarted","Data":"3c6a3e07c72ed8c1bb7f2e42a623b437deaa88e1d9979eefafaa45fb0ff45499"} Jan 27 16:40:06 crc kubenswrapper[4772]: I0127 16:40:06.083330 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:40:06 crc kubenswrapper[4772]: I0127 16:40:06.083615 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:40:06 crc kubenswrapper[4772]: I0127 16:40:06.502126 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"02c9bfb0-5f95-4692-bb48-ffd47f6682a8","Type":"ContainerStarted","Data":"6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336"} Jan 27 16:40:06 crc kubenswrapper[4772]: I0127 16:40:06.529233 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.529209142 podStartE2EDuration="2.529209142s" podCreationTimestamp="2026-01-27 16:40:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:06.518220459 +0000 UTC m=+5592.498829577" watchObservedRunningTime="2026-01-27 16:40:06.529209142 +0000 UTC m=+5592.509818260" Jan 27 16:40:07 crc kubenswrapper[4772]: I0127 16:40:07.079527 4772 scope.go:117] "RemoveContainer" containerID="1bbb692ab013dbd6aab71415abc05175b7c980ab72c48771e22a05f7a67573bd" Jan 27 16:40:08 crc kubenswrapper[4772]: I0127 16:40:08.807121 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.365628 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-p6mbs"] Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.367262 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.369261 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.371095 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.406927 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-p6mbs"] Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.429900 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.430084 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdxkp\" (UniqueName: \"kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.430451 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.430506 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.532520 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.532571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.532602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.532693 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdxkp\" (UniqueName: \"kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.539959 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.543793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.544383 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.554799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdxkp\" (UniqueName: \"kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp\") pod \"nova-cell1-cell-mapping-p6mbs\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.704927 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:09 crc kubenswrapper[4772]: I0127 16:40:09.878427 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 16:40:10 crc kubenswrapper[4772]: I0127 16:40:10.138060 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-p6mbs"] Jan 27 16:40:10 crc kubenswrapper[4772]: I0127 16:40:10.537734 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p6mbs" event={"ID":"eaefb4fd-175a-4431-bd2f-7fc3205684b9","Type":"ContainerStarted","Data":"f57fe908a2ad59651426657b881a5c35f1371cdc115dc93b86dbcb58952ce8a9"} Jan 27 16:40:10 crc kubenswrapper[4772]: I0127 16:40:10.538056 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p6mbs" event={"ID":"eaefb4fd-175a-4431-bd2f-7fc3205684b9","Type":"ContainerStarted","Data":"abe8b822cc40f21143a3b475b6bbd880c198650912757b0aefe22d25ba054986"} Jan 27 16:40:11 crc kubenswrapper[4772]: I0127 16:40:11.084309 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:40:11 crc kubenswrapper[4772]: I0127 16:40:11.084357 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:40:11 crc kubenswrapper[4772]: I0127 16:40:11.086776 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:40:11 crc kubenswrapper[4772]: I0127 16:40:11.086821 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:40:12 crc kubenswrapper[4772]: I0127 16:40:12.248416 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.61:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:12 crc kubenswrapper[4772]: I0127 16:40:12.248882 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.60:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:12 crc kubenswrapper[4772]: I0127 16:40:12.249130 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.60:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:12 crc kubenswrapper[4772]: I0127 16:40:12.249211 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.61:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:14 crc kubenswrapper[4772]: I0127 16:40:14.877943 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 16:40:14 crc kubenswrapper[4772]: I0127 16:40:14.905815 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 16:40:14 crc kubenswrapper[4772]: I0127 16:40:14.925977 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-p6mbs" podStartSLOduration=5.925956122 podStartE2EDuration="5.925956122s" podCreationTimestamp="2026-01-27 16:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:10.5686014 +0000 UTC m=+5596.549210498" 
watchObservedRunningTime="2026-01-27 16:40:14.925956122 +0000 UTC m=+5600.906565220" Jan 27 16:40:15 crc kubenswrapper[4772]: I0127 16:40:15.601089 4772 generic.go:334] "Generic (PLEG): container finished" podID="eaefb4fd-175a-4431-bd2f-7fc3205684b9" containerID="f57fe908a2ad59651426657b881a5c35f1371cdc115dc93b86dbcb58952ce8a9" exitCode=0 Jan 27 16:40:15 crc kubenswrapper[4772]: I0127 16:40:15.601193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p6mbs" event={"ID":"eaefb4fd-175a-4431-bd2f-7fc3205684b9","Type":"ContainerDied","Data":"f57fe908a2ad59651426657b881a5c35f1371cdc115dc93b86dbcb58952ce8a9"} Jan 27 16:40:15 crc kubenswrapper[4772]: I0127 16:40:15.632608 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.915996 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.984508 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdxkp\" (UniqueName: \"kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp\") pod \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.984583 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle\") pod \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.984631 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data\") pod 
\"eaefb4fd-175a-4431-bd2f-7fc3205684b9\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.984663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts\") pod \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\" (UID: \"eaefb4fd-175a-4431-bd2f-7fc3205684b9\") " Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.991464 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp" (OuterVolumeSpecName: "kube-api-access-vdxkp") pod "eaefb4fd-175a-4431-bd2f-7fc3205684b9" (UID: "eaefb4fd-175a-4431-bd2f-7fc3205684b9"). InnerVolumeSpecName "kube-api-access-vdxkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:16 crc kubenswrapper[4772]: I0127 16:40:16.994024 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts" (OuterVolumeSpecName: "scripts") pod "eaefb4fd-175a-4431-bd2f-7fc3205684b9" (UID: "eaefb4fd-175a-4431-bd2f-7fc3205684b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.010052 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eaefb4fd-175a-4431-bd2f-7fc3205684b9" (UID: "eaefb4fd-175a-4431-bd2f-7fc3205684b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.015159 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data" (OuterVolumeSpecName: "config-data") pod "eaefb4fd-175a-4431-bd2f-7fc3205684b9" (UID: "eaefb4fd-175a-4431-bd2f-7fc3205684b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.086497 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdxkp\" (UniqueName: \"kubernetes.io/projected/eaefb4fd-175a-4431-bd2f-7fc3205684b9-kube-api-access-vdxkp\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.086531 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.086541 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.086549 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaefb4fd-175a-4431-bd2f-7fc3205684b9-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.619834 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p6mbs" event={"ID":"eaefb4fd-175a-4431-bd2f-7fc3205684b9","Type":"ContainerDied","Data":"abe8b822cc40f21143a3b475b6bbd880c198650912757b0aefe22d25ba054986"} Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.620193 4772 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="abe8b822cc40f21143a3b475b6bbd880c198650912757b0aefe22d25ba054986" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.619925 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p6mbs" Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.807585 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.807852 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-log" containerID="cri-o://eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b" gracePeriod=30 Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.807933 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-api" containerID="cri-o://6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789" gracePeriod=30 Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.820046 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.820372 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" containerName="nova-scheduler-scheduler" containerID="cri-o://6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" gracePeriod=30 Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.843632 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.843857 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" 
containerName="nova-metadata-log" containerID="cri-o://b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a" gracePeriod=30 Jan 27 16:40:17 crc kubenswrapper[4772]: I0127 16:40:17.843887 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-metadata" containerID="cri-o://a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5" gracePeriod=30 Jan 27 16:40:18 crc kubenswrapper[4772]: I0127 16:40:18.628583 4772 generic.go:334] "Generic (PLEG): container finished" podID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerID="b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a" exitCode=143 Jan 27 16:40:18 crc kubenswrapper[4772]: I0127 16:40:18.628670 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerDied","Data":"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a"} Jan 27 16:40:18 crc kubenswrapper[4772]: I0127 16:40:18.630241 4772 generic.go:334] "Generic (PLEG): container finished" podID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerID="eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b" exitCode=143 Jan 27 16:40:18 crc kubenswrapper[4772]: I0127 16:40:18.630322 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerDied","Data":"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b"} Jan 27 16:40:19 crc kubenswrapper[4772]: E0127 16:40:19.879501 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 
27 16:40:19 crc kubenswrapper[4772]: E0127 16:40:19.880929 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 16:40:19 crc kubenswrapper[4772]: E0127 16:40:19.884287 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 16:40:19 crc kubenswrapper[4772]: E0127 16:40:19.884358 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" containerName="nova-scheduler-scheduler" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.443918 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.452206 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.580864 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9rxm\" (UniqueName: \"kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm\") pod \"601c20bf-c7e2-4c26-a2da-7ebce076775a\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.580940 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle\") pod \"601c20bf-c7e2-4c26-a2da-7ebce076775a\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.580973 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data\") pod \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.581015 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs\") pod \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.581036 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data\") pod \"601c20bf-c7e2-4c26-a2da-7ebce076775a\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.581058 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle\") pod \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.581113 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs\") pod \"601c20bf-c7e2-4c26-a2da-7ebce076775a\" (UID: \"601c20bf-c7e2-4c26-a2da-7ebce076775a\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.581228 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r644m\" (UniqueName: \"kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m\") pod \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\" (UID: \"0576a6dc-0ce0-4e70-8be2-179989ed0c03\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.582947 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs" (OuterVolumeSpecName: "logs") pod "601c20bf-c7e2-4c26-a2da-7ebce076775a" (UID: "601c20bf-c7e2-4c26-a2da-7ebce076775a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.585376 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs" (OuterVolumeSpecName: "logs") pod "0576a6dc-0ce0-4e70-8be2-179989ed0c03" (UID: "0576a6dc-0ce0-4e70-8be2-179989ed0c03"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.588388 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm" (OuterVolumeSpecName: "kube-api-access-c9rxm") pod "601c20bf-c7e2-4c26-a2da-7ebce076775a" (UID: "601c20bf-c7e2-4c26-a2da-7ebce076775a"). InnerVolumeSpecName "kube-api-access-c9rxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.588438 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m" (OuterVolumeSpecName: "kube-api-access-r644m") pod "0576a6dc-0ce0-4e70-8be2-179989ed0c03" (UID: "0576a6dc-0ce0-4e70-8be2-179989ed0c03"). InnerVolumeSpecName "kube-api-access-r644m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.606434 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0576a6dc-0ce0-4e70-8be2-179989ed0c03" (UID: "0576a6dc-0ce0-4e70-8be2-179989ed0c03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.606662 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "601c20bf-c7e2-4c26-a2da-7ebce076775a" (UID: "601c20bf-c7e2-4c26-a2da-7ebce076775a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.607517 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data" (OuterVolumeSpecName: "config-data") pod "0576a6dc-0ce0-4e70-8be2-179989ed0c03" (UID: "0576a6dc-0ce0-4e70-8be2-179989ed0c03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.607880 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data" (OuterVolumeSpecName: "config-data") pod "601c20bf-c7e2-4c26-a2da-7ebce076775a" (UID: "601c20bf-c7e2-4c26-a2da-7ebce076775a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.655568 4772 generic.go:334] "Generic (PLEG): container finished" podID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" containerID="6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" exitCode=0 Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.655636 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02c9bfb0-5f95-4692-bb48-ffd47f6682a8","Type":"ContainerDied","Data":"6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336"} Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.657441 4772 generic.go:334] "Generic (PLEG): container finished" podID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerID="6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789" exitCode=0 Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.657501 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerDied","Data":"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789"} Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.657522 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"601c20bf-c7e2-4c26-a2da-7ebce076775a","Type":"ContainerDied","Data":"c71aafaf39ad009fb7eb0ade8a547f4797867d098bc1c0cd4f5e22f563ee2d30"} Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.657520 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.657543 4772 scope.go:117] "RemoveContainer" containerID="6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.659681 4772 generic.go:334] "Generic (PLEG): container finished" podID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerID="a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5" exitCode=0 Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.659731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerDied","Data":"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5"} Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.660387 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0576a6dc-0ce0-4e70-8be2-179989ed0c03","Type":"ContainerDied","Data":"272b9d5f7574afede07d814583b3f863aa00e225817941ab1c1713433efa66a6"} Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.659758 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684213 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684271 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684281 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0576a6dc-0ce0-4e70-8be2-179989ed0c03-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684289 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/601c20bf-c7e2-4c26-a2da-7ebce076775a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684298 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0576a6dc-0ce0-4e70-8be2-179989ed0c03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684305 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/601c20bf-c7e2-4c26-a2da-7ebce076775a-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684333 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r644m\" (UniqueName: \"kubernetes.io/projected/0576a6dc-0ce0-4e70-8be2-179989ed0c03-kube-api-access-r644m\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.684346 4772 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-c9rxm\" (UniqueName: \"kubernetes.io/projected/601c20bf-c7e2-4c26-a2da-7ebce076775a-kube-api-access-c9rxm\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.712283 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.716624 4772 scope.go:117] "RemoveContainer" containerID="eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.727865 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.741354 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.749361 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758245 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.758779 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-log" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758795 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-log" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.758821 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaefb4fd-175a-4431-bd2f-7fc3205684b9" containerName="nova-manage" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758829 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaefb4fd-175a-4431-bd2f-7fc3205684b9" containerName="nova-manage" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.758845 4772 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-metadata" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758856 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-metadata" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.758875 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-log" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758882 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-log" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.758904 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-api" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.758911 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-api" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.759160 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-api" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.759194 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaefb4fd-175a-4431-bd2f-7fc3205684b9" containerName="nova-manage" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.759208 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" containerName="nova-api-log" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.759222 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-log" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.759236 4772 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" containerName="nova-metadata-metadata" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.760379 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.765960 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.771184 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.773635 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.777830 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.778017 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.791103 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.794715 4772 scope.go:117] "RemoveContainer" containerID="6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.802262 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789\": container with ID starting with 6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789 not found: ID does not exist" containerID="6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.802307 4772 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789"} err="failed to get container status \"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789\": rpc error: code = NotFound desc = could not find container \"6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789\": container with ID starting with 6905111d10a31c85326f0556bb66e5eba75c934131e8d3fe500da3f080ca6789 not found: ID does not exist" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.802330 4772 scope.go:117] "RemoveContainer" containerID="eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.805512 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b\": container with ID starting with eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b not found: ID does not exist" containerID="eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.805556 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b"} err="failed to get container status \"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b\": rpc error: code = NotFound desc = could not find container \"eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b\": container with ID starting with eae5df57a84c79de36f586452630170afcbfff0b1e0fdb27c896a858c117ff8b not found: ID does not exist" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.805579 4772 scope.go:117] "RemoveContainer" containerID="a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.832098 4772 scope.go:117] "RemoveContainer" 
containerID="b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.854482 4772 scope.go:117] "RemoveContainer" containerID="a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.855108 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5\": container with ID starting with a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5 not found: ID does not exist" containerID="a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.855154 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5"} err="failed to get container status \"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5\": rpc error: code = NotFound desc = could not find container \"a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5\": container with ID starting with a0a5b8400ff9131c6a732b564464e4682a47bd72327030af67266b99caae0aa5 not found: ID does not exist" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.855200 4772 scope.go:117] "RemoveContainer" containerID="b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a" Jan 27 16:40:21 crc kubenswrapper[4772]: E0127 16:40:21.855480 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a\": container with ID starting with b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a not found: ID does not exist" containerID="b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a" Jan 27 16:40:21 crc 
kubenswrapper[4772]: I0127 16:40:21.855519 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a"} err="failed to get container status \"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a\": rpc error: code = NotFound desc = could not find container \"b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a\": container with ID starting with b99ac9784926d43b5b9010bb3bf145d936b87ce053aad43267f30ea8ebbbf38a not found: ID does not exist" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.886972 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vg99\" (UniqueName: \"kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887071 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887127 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs\") pod \"nova-api-0\" 
(UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887189 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887215 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfjb\" (UniqueName: \"kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.887257 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.933657 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988266 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khtr\" (UniqueName: \"kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr\") pod \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988397 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle\") pod \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988450 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data\") pod \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\" (UID: \"02c9bfb0-5f95-4692-bb48-ffd47f6682a8\") " Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988701 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988720 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfjb\" 
(UniqueName: \"kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988777 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988821 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988846 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vg99\" (UniqueName: \"kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988866 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.988935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc 
kubenswrapper[4772]: I0127 16:40:21.989406 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.989850 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.994972 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.994972 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:21 crc kubenswrapper[4772]: I0127 16:40:21.995473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.002366 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr" (OuterVolumeSpecName: "kube-api-access-2khtr") pod 
"02c9bfb0-5f95-4692-bb48-ffd47f6682a8" (UID: "02c9bfb0-5f95-4692-bb48-ffd47f6682a8"). InnerVolumeSpecName "kube-api-access-2khtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.003632 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.014341 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfjb\" (UniqueName: \"kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb\") pod \"nova-metadata-0\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " pod="openstack/nova-metadata-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.014774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vg99\" (UniqueName: \"kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99\") pod \"nova-api-0\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " pod="openstack/nova-api-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.022569 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02c9bfb0-5f95-4692-bb48-ffd47f6682a8" (UID: "02c9bfb0-5f95-4692-bb48-ffd47f6682a8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.027404 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data" (OuterVolumeSpecName: "config-data") pod "02c9bfb0-5f95-4692-bb48-ffd47f6682a8" (UID: "02c9bfb0-5f95-4692-bb48-ffd47f6682a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.090306 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khtr\" (UniqueName: \"kubernetes.io/projected/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-kube-api-access-2khtr\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.090336 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.090344 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02c9bfb0-5f95-4692-bb48-ffd47f6682a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.091967 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.099990 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:40:22 crc kubenswrapper[4772]: W0127 16:40:22.571103 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c652921_712d_46ee_9683_fd6312e33d1e.slice/crio-8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9 WatchSource:0}: Error finding container 8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9: Status 404 returned error can't find the container with id 8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9 Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.572620 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.613646 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.678794 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0576a6dc-0ce0-4e70-8be2-179989ed0c03" path="/var/lib/kubelet/pods/0576a6dc-0ce0-4e70-8be2-179989ed0c03/volumes" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.679538 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="601c20bf-c7e2-4c26-a2da-7ebce076775a" path="/var/lib/kubelet/pods/601c20bf-c7e2-4c26-a2da-7ebce076775a/volumes" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.680203 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.680555 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerStarted","Data":"8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9"} Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.680589 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerStarted","Data":"ac2ae0550771f81842cdb596165399767b9e27c96df8f6a9cc1d4cc43f6c15e1"} Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.680605 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02c9bfb0-5f95-4692-bb48-ffd47f6682a8","Type":"ContainerDied","Data":"3c6a3e07c72ed8c1bb7f2e42a623b437deaa88e1d9979eefafaa45fb0ff45499"} Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.680625 4772 scope.go:117] "RemoveContainer" containerID="6f9e0ed02577ed79c4498daebc32e5d9adfca8bf0094d3a8ddb00a6642f3d336" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.738449 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.746129 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.755042 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: E0127 16:40:22.755667 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" containerName="nova-scheduler-scheduler" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.755685 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" 
containerName="nova-scheduler-scheduler" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.755848 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" containerName="nova-scheduler-scheduler" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.756477 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.762726 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.770382 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.802552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.802709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ckmf\" (UniqueName: \"kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.802759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.904586 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ckmf\" (UniqueName: \"kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.904670 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.904704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.910125 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.910319 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:22 crc kubenswrapper[4772]: I0127 16:40:22.921974 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ckmf\" (UniqueName: 
\"kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf\") pod \"nova-scheduler-0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " pod="openstack/nova-scheduler-0" Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.077945 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.524815 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.690446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerStarted","Data":"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.690490 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerStarted","Data":"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.693881 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce725b6a-06bb-4339-819f-fee8819078f0","Type":"ContainerStarted","Data":"33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.693918 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce725b6a-06bb-4339-819f-fee8819078f0","Type":"ContainerStarted","Data":"c5ffffe294d87ab27553633999a0d84408edaf82429449f00a94658621086fd0"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.696916 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerStarted","Data":"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.696955 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerStarted","Data":"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4"} Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.710125 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7101066940000003 podStartE2EDuration="2.710106694s" podCreationTimestamp="2026-01-27 16:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:23.707543661 +0000 UTC m=+5609.688152779" watchObservedRunningTime="2026-01-27 16:40:23.710106694 +0000 UTC m=+5609.690715792" Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.728138 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.728117408 podStartE2EDuration="2.728117408s" podCreationTimestamp="2026-01-27 16:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:23.724613108 +0000 UTC m=+5609.705222236" watchObservedRunningTime="2026-01-27 16:40:23.728117408 +0000 UTC m=+5609.708726506" Jan 27 16:40:23 crc kubenswrapper[4772]: I0127 16:40:23.749583 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.749558399 podStartE2EDuration="1.749558399s" podCreationTimestamp="2026-01-27 16:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
16:40:23.740930343 +0000 UTC m=+5609.721539451" watchObservedRunningTime="2026-01-27 16:40:23.749558399 +0000 UTC m=+5609.730167507" Jan 27 16:40:24 crc kubenswrapper[4772]: I0127 16:40:24.705131 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02c9bfb0-5f95-4692-bb48-ffd47f6682a8" path="/var/lib/kubelet/pods/02c9bfb0-5f95-4692-bb48-ffd47f6682a8/volumes" Jan 27 16:40:27 crc kubenswrapper[4772]: I0127 16:40:27.092694 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:40:27 crc kubenswrapper[4772]: I0127 16:40:27.093008 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:40:28 crc kubenswrapper[4772]: I0127 16:40:28.078546 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 16:40:32 crc kubenswrapper[4772]: I0127 16:40:32.093416 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:40:32 crc kubenswrapper[4772]: I0127 16:40:32.093955 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:40:32 crc kubenswrapper[4772]: I0127 16:40:32.100941 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:40:32 crc kubenswrapper[4772]: I0127 16:40:32.100985 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.078825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.109544 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.217451 4772 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.258432 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.258719 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.64:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.258884 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.64:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:40:33 crc kubenswrapper[4772]: I0127 16:40:33.831868 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.058666 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.059397 4772 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.094820 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.095275 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.096761 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.104205 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.104563 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.107838 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.111743 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.898497 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.921259 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 16:40:42 crc kubenswrapper[4772]: I0127 16:40:42.961785 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 
16:40:43.250317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.251989 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.262782 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.308831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.308907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.308935 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.309004 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dkhk\" (UniqueName: \"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: 
\"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.309057 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.410933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.410972 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.411038 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dkhk\" (UniqueName: \"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.411073 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " 
pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.411166 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.411930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.412076 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.412237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.412777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.435477 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dkhk\" (UniqueName: \"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk\") pod \"dnsmasq-dns-89c56cfbf-9f62r\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:43 crc kubenswrapper[4772]: I0127 16:40:43.623316 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:44 crc kubenswrapper[4772]: I0127 16:40:44.095324 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:40:44 crc kubenswrapper[4772]: I0127 16:40:44.927157 4772 generic.go:334] "Generic (PLEG): container finished" podID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerID="3b284b2b9feaaf7731a5ba79953b14bcd6f7aec90b9b76040821cf8f6a1cb2f9" exitCode=0 Jan 27 16:40:44 crc kubenswrapper[4772]: I0127 16:40:44.927288 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" event={"ID":"487e140f-d3fb-4ece-a41c-7a1c55a37534","Type":"ContainerDied","Data":"3b284b2b9feaaf7731a5ba79953b14bcd6f7aec90b9b76040821cf8f6a1cb2f9"} Jan 27 16:40:44 crc kubenswrapper[4772]: I0127 16:40:44.928318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" event={"ID":"487e140f-d3fb-4ece-a41c-7a1c55a37534","Type":"ContainerStarted","Data":"e6f437da52e8342f357b6a59971f74fc9403dcd77f6fa7a76fcf902ca6dd800f"} Jan 27 16:40:45 crc kubenswrapper[4772]: I0127 16:40:45.942721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" event={"ID":"487e140f-d3fb-4ece-a41c-7a1c55a37534","Type":"ContainerStarted","Data":"b740d63c9d1013074313725144a776e8a3b5f1fd4cbe0771f67139ace29cab8f"} Jan 27 16:40:45 crc kubenswrapper[4772]: I0127 16:40:45.944314 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:45 crc kubenswrapper[4772]: I0127 16:40:45.994000 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" podStartSLOduration=2.993977669 podStartE2EDuration="2.993977669s" podCreationTimestamp="2026-01-27 16:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:40:45.985442756 +0000 UTC m=+5631.966051874" watchObservedRunningTime="2026-01-27 16:40:45.993977669 +0000 UTC m=+5631.974586767" Jan 27 16:40:53 crc kubenswrapper[4772]: I0127 16:40:53.625489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:40:53 crc kubenswrapper[4772]: I0127 16:40:53.708941 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"] Jan 27 16:40:53 crc kubenswrapper[4772]: I0127 16:40:53.712525 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7758766f57-6fk67" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="dnsmasq-dns" containerID="cri-o://97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5" gracePeriod=10 Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.193765 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.369309 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb\") pod \"f90709ae-7118-42ca-aad1-922961a6f858\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.369361 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc\") pod \"f90709ae-7118-42ca-aad1-922961a6f858\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.369400 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb\") pod \"f90709ae-7118-42ca-aad1-922961a6f858\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.369419 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config\") pod \"f90709ae-7118-42ca-aad1-922961a6f858\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.369527 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv5xf\" (UniqueName: \"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf\") pod \"f90709ae-7118-42ca-aad1-922961a6f858\" (UID: \"f90709ae-7118-42ca-aad1-922961a6f858\") " Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.374504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf" (OuterVolumeSpecName: "kube-api-access-bv5xf") pod "f90709ae-7118-42ca-aad1-922961a6f858" (UID: "f90709ae-7118-42ca-aad1-922961a6f858"). InnerVolumeSpecName "kube-api-access-bv5xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.425337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f90709ae-7118-42ca-aad1-922961a6f858" (UID: "f90709ae-7118-42ca-aad1-922961a6f858"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.427790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config" (OuterVolumeSpecName: "config") pod "f90709ae-7118-42ca-aad1-922961a6f858" (UID: "f90709ae-7118-42ca-aad1-922961a6f858"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.439667 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f90709ae-7118-42ca-aad1-922961a6f858" (UID: "f90709ae-7118-42ca-aad1-922961a6f858"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.448861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f90709ae-7118-42ca-aad1-922961a6f858" (UID: "f90709ae-7118-42ca-aad1-922961a6f858"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.463483 4772 generic.go:334] "Generic (PLEG): container finished" podID="f90709ae-7118-42ca-aad1-922961a6f858" containerID="97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5" exitCode=0 Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.463526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7758766f57-6fk67" event={"ID":"f90709ae-7118-42ca-aad1-922961a6f858","Type":"ContainerDied","Data":"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5"} Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.463551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7758766f57-6fk67" event={"ID":"f90709ae-7118-42ca-aad1-922961a6f858","Type":"ContainerDied","Data":"b319140f655aa1b23b1e685173361086b6144080f0c58e842877b6cfd23b2922"} Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.463570 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7758766f57-6fk67" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.463624 4772 scope.go:117] "RemoveContainer" containerID="97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.471252 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.471276 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.471286 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.471294 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f90709ae-7118-42ca-aad1-922961a6f858-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.471303 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv5xf\" (UniqueName: \"kubernetes.io/projected/f90709ae-7118-42ca-aad1-922961a6f858-kube-api-access-bv5xf\") on node \"crc\" DevicePath \"\"" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.499779 4772 scope.go:117] "RemoveContainer" containerID="ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.506261 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"] Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.514305 4772 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7758766f57-6fk67"] Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.519920 4772 scope.go:117] "RemoveContainer" containerID="97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5" Jan 27 16:40:54 crc kubenswrapper[4772]: E0127 16:40:54.520448 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5\": container with ID starting with 97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5 not found: ID does not exist" containerID="97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.520483 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5"} err="failed to get container status \"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5\": rpc error: code = NotFound desc = could not find container \"97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5\": container with ID starting with 97e6d0f1812bfdc92938ed911c0ea776bf32e0bb2e1f04fe8e722bdbc3e2b5e5 not found: ID does not exist" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.520507 4772 scope.go:117] "RemoveContainer" containerID="ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51" Jan 27 16:40:54 crc kubenswrapper[4772]: E0127 16:40:54.520814 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51\": container with ID starting with ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51 not found: ID does not exist" containerID="ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51" Jan 27 16:40:54 crc 
kubenswrapper[4772]: I0127 16:40:54.520836 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51"} err="failed to get container status \"ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51\": rpc error: code = NotFound desc = could not find container \"ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51\": container with ID starting with ea368eea29b6534237f47789a13e7bcf3cb4d11f06fe750858483cf68a03ec51 not found: ID does not exist" Jan 27 16:40:54 crc kubenswrapper[4772]: I0127 16:40:54.676513 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f90709ae-7118-42ca-aad1-922961a6f858" path="/var/lib/kubelet/pods/f90709ae-7118-42ca-aad1-922961a6f858/volumes" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.007359 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-hk6d6"] Jan 27 16:40:56 crc kubenswrapper[4772]: E0127 16:40:56.008009 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="init" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.008023 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="init" Jan 27 16:40:56 crc kubenswrapper[4772]: E0127 16:40:56.008048 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="dnsmasq-dns" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.008054 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="dnsmasq-dns" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.008234 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f90709ae-7118-42ca-aad1-922961a6f858" containerName="dnsmasq-dns" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 
16:40:56.008875 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.023856 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hk6d6"] Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.099375 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-e17c-account-create-update-4ccmc"] Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.100600 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.102866 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.113598 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e17c-account-create-update-4ccmc"] Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.201565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.201633 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzz8\" (UniqueName: \"kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.304104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjgt\" 
(UniqueName: \"kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.304211 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.304287 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.304315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzz8\" (UniqueName: \"kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.305373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.321187 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzz8\" 
(UniqueName: \"kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8\") pod \"cinder-db-create-hk6d6\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.333472 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hk6d6" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.406347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.407117 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjgt\" (UniqueName: \"kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.407236 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:56 crc kubenswrapper[4772]: I0127 16:40:56.430044 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjgt\" (UniqueName: \"kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt\") pod \"cinder-e17c-account-create-update-4ccmc\" (UID: 
\"d67b98cc-0659-4ebd-a96a-025044731558\") " pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:57 crc kubenswrapper[4772]: I0127 16:40:56.718284 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:40:57 crc kubenswrapper[4772]: I0127 16:40:57.398691 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-hk6d6"] Jan 27 16:40:57 crc kubenswrapper[4772]: I0127 16:40:57.492667 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hk6d6" event={"ID":"357c50eb-1246-4dd8-975c-b10d09439cbd","Type":"ContainerStarted","Data":"376bf0f983c24165f1737a8abc049bd61ef568f18bb3826a169d2c905cf7cb55"} Jan 27 16:40:57 crc kubenswrapper[4772]: I0127 16:40:57.505979 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-e17c-account-create-update-4ccmc"] Jan 27 16:40:58 crc kubenswrapper[4772]: I0127 16:40:58.503075 4772 generic.go:334] "Generic (PLEG): container finished" podID="357c50eb-1246-4dd8-975c-b10d09439cbd" containerID="2746702aa9709faf5d661e469190b51cee57d01ba7b40596d6751ad98441dd4d" exitCode=0 Jan 27 16:40:58 crc kubenswrapper[4772]: I0127 16:40:58.503133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hk6d6" event={"ID":"357c50eb-1246-4dd8-975c-b10d09439cbd","Type":"ContainerDied","Data":"2746702aa9709faf5d661e469190b51cee57d01ba7b40596d6751ad98441dd4d"} Jan 27 16:40:58 crc kubenswrapper[4772]: I0127 16:40:58.505264 4772 generic.go:334] "Generic (PLEG): container finished" podID="d67b98cc-0659-4ebd-a96a-025044731558" containerID="dab4307441a9ab5d1178a5547c03edae0ead3f783db4ef2bf29e8414026bb08f" exitCode=0 Jan 27 16:40:58 crc kubenswrapper[4772]: I0127 16:40:58.505310 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e17c-account-create-update-4ccmc" 
event={"ID":"d67b98cc-0659-4ebd-a96a-025044731558","Type":"ContainerDied","Data":"dab4307441a9ab5d1178a5547c03edae0ead3f783db4ef2bf29e8414026bb08f"} Jan 27 16:40:58 crc kubenswrapper[4772]: I0127 16:40:58.505339 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e17c-account-create-update-4ccmc" event={"ID":"d67b98cc-0659-4ebd-a96a-025044731558","Type":"ContainerStarted","Data":"e8c6958939d862ea28c0f21a1d8ba7e761ac273dda2bdefeb7b657d0d56e4fa0"} Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.008568 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.017999 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-hk6d6" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.175245 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjgt\" (UniqueName: \"kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt\") pod \"d67b98cc-0659-4ebd-a96a-025044731558\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.175308 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts\") pod \"357c50eb-1246-4dd8-975c-b10d09439cbd\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.175357 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlzz8\" (UniqueName: \"kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8\") pod \"357c50eb-1246-4dd8-975c-b10d09439cbd\" (UID: \"357c50eb-1246-4dd8-975c-b10d09439cbd\") " Jan 27 16:41:00 crc kubenswrapper[4772]: 
I0127 16:41:00.175382 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts\") pod \"d67b98cc-0659-4ebd-a96a-025044731558\" (UID: \"d67b98cc-0659-4ebd-a96a-025044731558\") " Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.175820 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "357c50eb-1246-4dd8-975c-b10d09439cbd" (UID: "357c50eb-1246-4dd8-975c-b10d09439cbd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.175954 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/357c50eb-1246-4dd8-975c-b10d09439cbd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.176360 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d67b98cc-0659-4ebd-a96a-025044731558" (UID: "d67b98cc-0659-4ebd-a96a-025044731558"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.185410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8" (OuterVolumeSpecName: "kube-api-access-wlzz8") pod "357c50eb-1246-4dd8-975c-b10d09439cbd" (UID: "357c50eb-1246-4dd8-975c-b10d09439cbd"). InnerVolumeSpecName "kube-api-access-wlzz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.185525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt" (OuterVolumeSpecName: "kube-api-access-wsjgt") pod "d67b98cc-0659-4ebd-a96a-025044731558" (UID: "d67b98cc-0659-4ebd-a96a-025044731558"). InnerVolumeSpecName "kube-api-access-wsjgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.276911 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjgt\" (UniqueName: \"kubernetes.io/projected/d67b98cc-0659-4ebd-a96a-025044731558-kube-api-access-wsjgt\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.277075 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlzz8\" (UniqueName: \"kubernetes.io/projected/357c50eb-1246-4dd8-975c-b10d09439cbd-kube-api-access-wlzz8\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.277127 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67b98cc-0659-4ebd-a96a-025044731558-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.536218 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-hk6d6" event={"ID":"357c50eb-1246-4dd8-975c-b10d09439cbd","Type":"ContainerDied","Data":"376bf0f983c24165f1737a8abc049bd61ef568f18bb3826a169d2c905cf7cb55"} Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.536549 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="376bf0f983c24165f1737a8abc049bd61ef568f18bb3826a169d2c905cf7cb55" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.536706 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-hk6d6" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.539021 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-e17c-account-create-update-4ccmc" event={"ID":"d67b98cc-0659-4ebd-a96a-025044731558","Type":"ContainerDied","Data":"e8c6958939d862ea28c0f21a1d8ba7e761ac273dda2bdefeb7b657d0d56e4fa0"} Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.539075 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8c6958939d862ea28c0f21a1d8ba7e761ac273dda2bdefeb7b657d0d56e4fa0" Jan 27 16:41:00 crc kubenswrapper[4772]: I0127 16:41:00.539158 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-e17c-account-create-update-4ccmc" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.187686 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-66rdf"] Jan 27 16:41:06 crc kubenswrapper[4772]: E0127 16:41:06.188889 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="357c50eb-1246-4dd8-975c-b10d09439cbd" containerName="mariadb-database-create" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.188912 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="357c50eb-1246-4dd8-975c-b10d09439cbd" containerName="mariadb-database-create" Jan 27 16:41:06 crc kubenswrapper[4772]: E0127 16:41:06.188948 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67b98cc-0659-4ebd-a96a-025044731558" containerName="mariadb-account-create-update" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.188960 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67b98cc-0659-4ebd-a96a-025044731558" containerName="mariadb-account-create-update" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.189287 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="357c50eb-1246-4dd8-975c-b10d09439cbd" 
containerName="mariadb-database-create" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.189320 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67b98cc-0659-4ebd-a96a-025044731558" containerName="mariadb-account-create-update" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.190320 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.193132 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.193133 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-p7qxg" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.195726 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.200978 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-66rdf"] Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320127 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320307 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320335 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320363 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjk7\" (UniqueName: \"kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.320386 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421434 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421492 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dhjk7\" (UniqueName: \"kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421515 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421569 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421661 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.421960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id\") pod \"cinder-db-sync-66rdf\" (UID: 
\"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.427330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.427964 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.431641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.431913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.443937 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjk7\" (UniqueName: \"kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7\") pod \"cinder-db-sync-66rdf\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:06 crc kubenswrapper[4772]: I0127 16:41:06.519181 4772 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:07 crc kubenswrapper[4772]: I0127 16:41:07.100398 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-66rdf"] Jan 27 16:41:07 crc kubenswrapper[4772]: I0127 16:41:07.639615 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66rdf" event={"ID":"e9bc1548-ca21-4230-a5db-a9321ab69a37","Type":"ContainerStarted","Data":"1da868c60af8c377653c0df4d049fe41db3740b753fe0048ef2e596877f8308c"} Jan 27 16:41:08 crc kubenswrapper[4772]: I0127 16:41:08.651138 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66rdf" event={"ID":"e9bc1548-ca21-4230-a5db-a9321ab69a37","Type":"ContainerStarted","Data":"d0051faf5f33fa9d044b3023d9e4654d63902fac62af135831e6d6e9a248c7b6"} Jan 27 16:41:08 crc kubenswrapper[4772]: I0127 16:41:08.674358 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-66rdf" podStartSLOduration=2.674341098 podStartE2EDuration="2.674341098s" podCreationTimestamp="2026-01-27 16:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:08.667306317 +0000 UTC m=+5654.647915415" watchObservedRunningTime="2026-01-27 16:41:08.674341098 +0000 UTC m=+5654.654950196" Jan 27 16:41:10 crc kubenswrapper[4772]: I0127 16:41:10.675205 4772 generic.go:334] "Generic (PLEG): container finished" podID="e9bc1548-ca21-4230-a5db-a9321ab69a37" containerID="d0051faf5f33fa9d044b3023d9e4654d63902fac62af135831e6d6e9a248c7b6" exitCode=0 Jan 27 16:41:10 crc kubenswrapper[4772]: I0127 16:41:10.686935 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66rdf" event={"ID":"e9bc1548-ca21-4230-a5db-a9321ab69a37","Type":"ContainerDied","Data":"d0051faf5f33fa9d044b3023d9e4654d63902fac62af135831e6d6e9a248c7b6"} Jan 27 16:41:12 crc 
kubenswrapper[4772]: I0127 16:41:12.058657 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.059233 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.075487 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.147475 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.147547 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.147598 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 
crc kubenswrapper[4772]: I0127 16:41:12.147738 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjk7\" (UniqueName: \"kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.147780 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.147909 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts\") pod \"e9bc1548-ca21-4230-a5db-a9321ab69a37\" (UID: \"e9bc1548-ca21-4230-a5db-a9321ab69a37\") " Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.148411 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.153658 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7" (OuterVolumeSpecName: "kube-api-access-dhjk7") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "kube-api-access-dhjk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.154256 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts" (OuterVolumeSpecName: "scripts") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.156195 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.179460 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.194729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data" (OuterVolumeSpecName: "config-data") pod "e9bc1548-ca21-4230-a5db-a9321ab69a37" (UID: "e9bc1548-ca21-4230-a5db-a9321ab69a37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248917 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248947 4772 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248958 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e9bc1548-ca21-4230-a5db-a9321ab69a37-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248966 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248975 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjk7\" (UniqueName: \"kubernetes.io/projected/e9bc1548-ca21-4230-a5db-a9321ab69a37-kube-api-access-dhjk7\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.248984 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9bc1548-ca21-4230-a5db-a9321ab69a37-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.702150 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-66rdf" event={"ID":"e9bc1548-ca21-4230-a5db-a9321ab69a37","Type":"ContainerDied","Data":"1da868c60af8c377653c0df4d049fe41db3740b753fe0048ef2e596877f8308c"} Jan 27 16:41:12 crc 
kubenswrapper[4772]: I0127 16:41:12.702222 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da868c60af8c377653c0df4d049fe41db3740b753fe0048ef2e596877f8308c" Jan 27 16:41:12 crc kubenswrapper[4772]: I0127 16:41:12.702253 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-66rdf" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.019566 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8896c5c8c-s6z7x"] Jan 27 16:41:13 crc kubenswrapper[4772]: E0127 16:41:13.020027 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bc1548-ca21-4230-a5db-a9321ab69a37" containerName="cinder-db-sync" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.020042 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bc1548-ca21-4230-a5db-a9321ab69a37" containerName="cinder-db-sync" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.023864 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bc1548-ca21-4230-a5db-a9321ab69a37" containerName="cinder-db-sync" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.025428 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.052638 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8896c5c8c-s6z7x"] Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.137869 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.139817 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.146426 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-p7qxg" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.146609 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.146805 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.147284 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.148846 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.164702 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-dns-svc\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.164819 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-sb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.164859 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-config\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: 
\"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.164922 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24z7g\" (UniqueName: \"kubernetes.io/projected/22e08251-8371-4470-bc3e-d88d673d56f3-kube-api-access-24z7g\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.164999 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-nb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.266892 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.266957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-nb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: 
\"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267145 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-dns-svc\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267194 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267231 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267473 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-sb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 
16:41:13.267499 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf7qv\" (UniqueName: \"kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267521 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-config\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.267565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24z7g\" (UniqueName: \"kubernetes.io/projected/22e08251-8371-4470-bc3e-d88d673d56f3-kube-api-access-24z7g\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.268948 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-nb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.269069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-dns-svc\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.278783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-config\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.282800 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22e08251-8371-4470-bc3e-d88d673d56f3-ovsdbserver-sb\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.324461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24z7g\" (UniqueName: \"kubernetes.io/projected/22e08251-8371-4470-bc3e-d88d673d56f3-kube-api-access-24z7g\") pod \"dnsmasq-dns-8896c5c8c-s6z7x\" (UID: \"22e08251-8371-4470-bc3e-d88d673d56f3\") " pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370200 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs\") pod \"cinder-api-0\" (UID: 
\"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370304 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf7qv\" (UniqueName: \"kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370356 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370402 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.370429 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.371475 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.372305 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.374737 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.375090 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.375901 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.376176 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.392448 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data\") pod 
\"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.410155 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf7qv\" (UniqueName: \"kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv\") pod \"cinder-api-0\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.464070 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:13 crc kubenswrapper[4772]: I0127 16:41:13.931659 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8896c5c8c-s6z7x"] Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.097096 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.758624 4772 generic.go:334] "Generic (PLEG): container finished" podID="22e08251-8371-4470-bc3e-d88d673d56f3" containerID="f8de1200ab762c5af38758e595e2a2a98ba09782a2b206aac9f09f13a5fccfdb" exitCode=0 Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.759111 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" event={"ID":"22e08251-8371-4470-bc3e-d88d673d56f3","Type":"ContainerDied","Data":"f8de1200ab762c5af38758e595e2a2a98ba09782a2b206aac9f09f13a5fccfdb"} Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.759145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" event={"ID":"22e08251-8371-4470-bc3e-d88d673d56f3","Type":"ContainerStarted","Data":"f53bbe315294f89e556d416ad0831cf8d150079c678fca10d9cd19f4263b92e5"} Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.767657 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerStarted","Data":"714b9b6c5c6438e8d52919984e494ff614fd56b03d77fad32ca28a5a24726e46"} Jan 27 16:41:14 crc kubenswrapper[4772]: I0127 16:41:14.767697 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerStarted","Data":"1985a43f384e22d29818a239c3a060f96131fc47f07c59ec7571d217b8454dc4"} Jan 27 16:41:15 crc kubenswrapper[4772]: I0127 16:41:15.776433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" event={"ID":"22e08251-8371-4470-bc3e-d88d673d56f3","Type":"ContainerStarted","Data":"5a92c845e63b80a27d43fbe97b07377bb4993d6fc150d43118a8c3a2e9f82858"} Jan 27 16:41:15 crc kubenswrapper[4772]: I0127 16:41:15.776657 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:15 crc kubenswrapper[4772]: I0127 16:41:15.780081 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerStarted","Data":"31e3d55d1d113b922a3f84db51ffbe33b9ce1af8297930f881b9d6d32892374f"} Jan 27 16:41:15 crc kubenswrapper[4772]: I0127 16:41:15.780273 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 16:41:15 crc kubenswrapper[4772]: I0127 16:41:15.798947 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" podStartSLOduration=3.7989276050000003 podStartE2EDuration="3.798927605s" podCreationTimestamp="2026-01-27 16:41:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:15.794821838 +0000 UTC m=+5661.775430936" watchObservedRunningTime="2026-01-27 16:41:15.798927605 +0000 UTC m=+5661.779536703" Jan 27 16:41:15 crc 
kubenswrapper[4772]: I0127 16:41:15.823917 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.823897317 podStartE2EDuration="2.823897317s" podCreationTimestamp="2026-01-27 16:41:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:15.815207689 +0000 UTC m=+5661.795816787" watchObservedRunningTime="2026-01-27 16:41:15.823897317 +0000 UTC m=+5661.804506415" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.530920 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.533990 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.545772 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.663344 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.663562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.663637 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcwvs\" (UniqueName: \"kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.765264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.765354 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcwvs\" (UniqueName: \"kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.765389 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.765947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.765979 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.785138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcwvs\" (UniqueName: \"kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs\") pod \"redhat-marketplace-qzhn5\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:17 crc kubenswrapper[4772]: I0127 16:41:17.853691 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:18 crc kubenswrapper[4772]: I0127 16:41:18.335389 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:18 crc kubenswrapper[4772]: W0127 16:41:18.346296 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18a7da1_2037_446c_8646_76917bb9544b.slice/crio-54fcdad16f86f21c30f016c1e9b9c8d1af9bb4af36e051b32118c18fdeb26dbf WatchSource:0}: Error finding container 54fcdad16f86f21c30f016c1e9b9c8d1af9bb4af36e051b32118c18fdeb26dbf: Status 404 returned error can't find the container with id 54fcdad16f86f21c30f016c1e9b9c8d1af9bb4af36e051b32118c18fdeb26dbf Jan 27 16:41:18 crc kubenswrapper[4772]: I0127 16:41:18.806103 4772 generic.go:334] "Generic (PLEG): container finished" podID="e18a7da1-2037-446c-8646-76917bb9544b" containerID="5ba8daff8c2db9de85548929fc9125eac49769ff0aed125f736c35420b1b3caf" exitCode=0 Jan 27 16:41:18 crc kubenswrapper[4772]: I0127 16:41:18.806157 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerDied","Data":"5ba8daff8c2db9de85548929fc9125eac49769ff0aed125f736c35420b1b3caf"} Jan 27 16:41:18 crc kubenswrapper[4772]: I0127 16:41:18.806502 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerStarted","Data":"54fcdad16f86f21c30f016c1e9b9c8d1af9bb4af36e051b32118c18fdeb26dbf"} Jan 27 16:41:21 crc kubenswrapper[4772]: I0127 16:41:21.840932 4772 generic.go:334] "Generic (PLEG): container finished" podID="e18a7da1-2037-446c-8646-76917bb9544b" containerID="93df129a7d9ca49629bdaa9f5049664a8fdbaa8353cdb2fc27b598bded3cdb48" exitCode=0 Jan 27 16:41:21 crc kubenswrapper[4772]: I0127 16:41:21.840983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerDied","Data":"93df129a7d9ca49629bdaa9f5049664a8fdbaa8353cdb2fc27b598bded3cdb48"} Jan 27 16:41:22 crc kubenswrapper[4772]: I0127 16:41:22.851547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerStarted","Data":"2438505e8dbbe6ceb8683042eba5b21297e5463c63d380b7ca6a75115f1e523f"} Jan 27 16:41:22 crc kubenswrapper[4772]: I0127 16:41:22.871836 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qzhn5" podStartSLOduration=2.403662734 podStartE2EDuration="5.87181893s" podCreationTimestamp="2026-01-27 16:41:17 +0000 UTC" firstStartedPulling="2026-01-27 16:41:18.807745778 +0000 UTC m=+5664.788354876" lastFinishedPulling="2026-01-27 16:41:22.275901974 +0000 UTC m=+5668.256511072" observedRunningTime="2026-01-27 16:41:22.867880888 +0000 UTC m=+5668.848489986" 
watchObservedRunningTime="2026-01-27 16:41:22.87181893 +0000 UTC m=+5668.852428028" Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.376696 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8896c5c8c-s6z7x" Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.437569 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.437871 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="dnsmasq-dns" containerID="cri-o://b740d63c9d1013074313725144a776e8a3b5f1fd4cbe0771f67139ace29cab8f" gracePeriod=10 Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.624439 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.67:5353: connect: connection refused" Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.865123 4772 generic.go:334] "Generic (PLEG): container finished" podID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerID="b740d63c9d1013074313725144a776e8a3b5f1fd4cbe0771f67139ace29cab8f" exitCode=0 Jan 27 16:41:23 crc kubenswrapper[4772]: I0127 16:41:23.865556 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" event={"ID":"487e140f-d3fb-4ece-a41c-7a1c55a37534","Type":"ContainerDied","Data":"b740d63c9d1013074313725144a776e8a3b5f1fd4cbe0771f67139ace29cab8f"} Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.087363 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.180639 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc\") pod \"487e140f-d3fb-4ece-a41c-7a1c55a37534\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.180850 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dkhk\" (UniqueName: \"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk\") pod \"487e140f-d3fb-4ece-a41c-7a1c55a37534\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.180939 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb\") pod \"487e140f-d3fb-4ece-a41c-7a1c55a37534\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.180988 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb\") pod \"487e140f-d3fb-4ece-a41c-7a1c55a37534\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.181025 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config\") pod \"487e140f-d3fb-4ece-a41c-7a1c55a37534\" (UID: \"487e140f-d3fb-4ece-a41c-7a1c55a37534\") " Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.204574 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk" (OuterVolumeSpecName: "kube-api-access-2dkhk") pod "487e140f-d3fb-4ece-a41c-7a1c55a37534" (UID: "487e140f-d3fb-4ece-a41c-7a1c55a37534"). InnerVolumeSpecName "kube-api-access-2dkhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.242131 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "487e140f-d3fb-4ece-a41c-7a1c55a37534" (UID: "487e140f-d3fb-4ece-a41c-7a1c55a37534"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.242576 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config" (OuterVolumeSpecName: "config") pod "487e140f-d3fb-4ece-a41c-7a1c55a37534" (UID: "487e140f-d3fb-4ece-a41c-7a1c55a37534"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.247277 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "487e140f-d3fb-4ece-a41c-7a1c55a37534" (UID: "487e140f-d3fb-4ece-a41c-7a1c55a37534"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.254602 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "487e140f-d3fb-4ece-a41c-7a1c55a37534" (UID: "487e140f-d3fb-4ece-a41c-7a1c55a37534"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.283711 4772 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.283763 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dkhk\" (UniqueName: \"kubernetes.io/projected/487e140f-d3fb-4ece-a41c-7a1c55a37534-kube-api-access-2dkhk\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.283778 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.283791 4772 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.283806 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487e140f-d3fb-4ece-a41c-7a1c55a37534-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.876121 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" event={"ID":"487e140f-d3fb-4ece-a41c-7a1c55a37534","Type":"ContainerDied","Data":"e6f437da52e8342f357b6a59971f74fc9403dcd77f6fa7a76fcf902ca6dd800f"} Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.876522 4772 scope.go:117] "RemoveContainer" containerID="b740d63c9d1013074313725144a776e8a3b5f1fd4cbe0771f67139ace29cab8f" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.876214 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c56cfbf-9f62r" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.904570 4772 scope.go:117] "RemoveContainer" containerID="3b284b2b9feaaf7731a5ba79953b14bcd6f7aec90b9b76040821cf8f6a1cb2f9" Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.909421 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.922305 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c56cfbf-9f62r"] Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.952181 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:24 crc kubenswrapper[4772]: I0127 16:41:24.952448 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" containerName="nova-cell0-conductor-conductor" containerID="cri-o://43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.021549 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.021835 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" containerID="cri-o://193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.022388 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" containerID="cri-o://b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193" gracePeriod=30 Jan 27 16:41:25 crc 
kubenswrapper[4772]: I0127 16:41:25.049266 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.049535 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-log" containerID="cri-o://8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.049943 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-api" containerID="cri-o://67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.071505 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.071750 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" containerName="nova-scheduler-scheduler" containerID="cri-o://33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.085729 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.086638 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a30c12eb-4e83-420a-8064-859689d91d2d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://fba71bd5b1f04f4ce21ecca46c2028b0aeffaa1f208e5b17978c2d0f906cbf36" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.094305 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.094690 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e1c55305-7d18-44cb-90e5-b6793989abda" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81" gracePeriod=30 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.567897 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.889018 4772 generic.go:334] "Generic (PLEG): container finished" podID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerID="8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0" exitCode=143 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.889199 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerDied","Data":"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0"} Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.891981 4772 generic.go:334] "Generic (PLEG): container finished" podID="a30c12eb-4e83-420a-8064-859689d91d2d" containerID="fba71bd5b1f04f4ce21ecca46c2028b0aeffaa1f208e5b17978c2d0f906cbf36" exitCode=0 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.892039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a30c12eb-4e83-420a-8064-859689d91d2d","Type":"ContainerDied","Data":"fba71bd5b1f04f4ce21ecca46c2028b0aeffaa1f208e5b17978c2d0f906cbf36"} Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.892061 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"a30c12eb-4e83-420a-8064-859689d91d2d","Type":"ContainerDied","Data":"772c077f79e9fa947b145341393fdb1b1ddc0a4d4b6ec55b124c53c8a8bb5534"} Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.892076 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="772c077f79e9fa947b145341393fdb1b1ddc0a4d4b6ec55b124c53c8a8bb5534" Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.898406 4772 generic.go:334] "Generic (PLEG): container finished" podID="8c652921-712d-46ee-9683-fd6312e33d1e" containerID="193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4" exitCode=143 Jan 27 16:41:25 crc kubenswrapper[4772]: I0127 16:41:25.898442 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerDied","Data":"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4"} Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.012492 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.143672 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22khv\" (UniqueName: \"kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv\") pod \"a30c12eb-4e83-420a-8064-859689d91d2d\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.143785 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data\") pod \"a30c12eb-4e83-420a-8064-859689d91d2d\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.143878 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle\") pod \"a30c12eb-4e83-420a-8064-859689d91d2d\" (UID: \"a30c12eb-4e83-420a-8064-859689d91d2d\") " Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.149629 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv" (OuterVolumeSpecName: "kube-api-access-22khv") pod "a30c12eb-4e83-420a-8064-859689d91d2d" (UID: "a30c12eb-4e83-420a-8064-859689d91d2d"). InnerVolumeSpecName "kube-api-access-22khv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.173232 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a30c12eb-4e83-420a-8064-859689d91d2d" (UID: "a30c12eb-4e83-420a-8064-859689d91d2d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.194568 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data" (OuterVolumeSpecName: "config-data") pod "a30c12eb-4e83-420a-8064-859689d91d2d" (UID: "a30c12eb-4e83-420a-8064-859689d91d2d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.245555 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.245762 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22khv\" (UniqueName: \"kubernetes.io/projected/a30c12eb-4e83-420a-8064-859689d91d2d-kube-api-access-22khv\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.245831 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c12eb-4e83-420a-8064-859689d91d2d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.674180 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" path="/var/lib/kubelet/pods/487e140f-d3fb-4ece-a41c-7a1c55a37534/volumes" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.907699 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.932546 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.957236 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.967492 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:26 crc kubenswrapper[4772]: E0127 16:41:26.968107 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="dnsmasq-dns" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.968209 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="dnsmasq-dns" Jan 27 16:41:26 crc kubenswrapper[4772]: E0127 16:41:26.968284 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="init" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.968334 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="init" Jan 27 16:41:26 crc kubenswrapper[4772]: E0127 16:41:26.968423 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a30c12eb-4e83-420a-8064-859689d91d2d" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.968473 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a30c12eb-4e83-420a-8064-859689d91d2d" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.968695 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a30c12eb-4e83-420a-8064-859689d91d2d" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 16:41:26 crc kubenswrapper[4772]: 
I0127 16:41:26.968788 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="487e140f-d3fb-4ece-a41c-7a1c55a37534" containerName="dnsmasq-dns" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.969449 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.974351 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 16:41:26 crc kubenswrapper[4772]: I0127 16:41:26.989975 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.160874 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.161119 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.161243 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl8m4\" (UniqueName: \"kubernetes.io/projected/db0a7ce2-c175-4632-abe5-f35a6b5ce680-kube-api-access-bl8m4\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.263098 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.263211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl8m4\" (UniqueName: \"kubernetes.io/projected/db0a7ce2-c175-4632-abe5-f35a6b5ce680-kube-api-access-bl8m4\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.263305 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.276375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.280020 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a7ce2-c175-4632-abe5-f35a6b5ce680-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.280468 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl8m4\" (UniqueName: 
\"kubernetes.io/projected/db0a7ce2-c175-4632-abe5-f35a6b5ce680-kube-api-access-bl8m4\") pod \"nova-cell1-novncproxy-0\" (UID: \"db0a7ce2-c175-4632-abe5-f35a6b5ce680\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.299640 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.766941 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.854942 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.854985 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.915880 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.922590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db0a7ce2-c175-4632-abe5-f35a6b5ce680","Type":"ContainerStarted","Data":"cc983685cb7f18229c2e45eee47ef7546e8b63e99119756d8a92f563e3b11e6e"} Jan 27 16:41:27 crc kubenswrapper[4772]: I0127 16:41:27.971220 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:28 crc kubenswrapper[4772]: E0127 16:41:28.080235 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 16:41:28 crc kubenswrapper[4772]: E0127 16:41:28.082354 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 16:41:28 crc kubenswrapper[4772]: E0127 16:41:28.084992 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 16:41:28 crc kubenswrapper[4772]: E0127 16:41:28.085048 4772 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" containerName="nova-scheduler-scheduler" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.153808 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.64:8775/\": read tcp 10.217.0.2:34972->10.217.1.64:8775: read: connection reset by peer" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.154201 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.155229 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"http://10.217.1.64:8775/\": read tcp 10.217.0.2:34970->10.217.1.64:8775: read: connection reset by peer" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.494204 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.586915 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle\") pod \"e1c55305-7d18-44cb-90e5-b6793989abda\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.587142 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data\") pod \"e1c55305-7d18-44cb-90e5-b6793989abda\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.587186 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdhmd\" (UniqueName: \"kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd\") pod \"e1c55305-7d18-44cb-90e5-b6793989abda\" (UID: \"e1c55305-7d18-44cb-90e5-b6793989abda\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.596388 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd" (OuterVolumeSpecName: "kube-api-access-vdhmd") pod "e1c55305-7d18-44cb-90e5-b6793989abda" (UID: "e1c55305-7d18-44cb-90e5-b6793989abda"). InnerVolumeSpecName "kube-api-access-vdhmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.641973 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data" (OuterVolumeSpecName: "config-data") pod "e1c55305-7d18-44cb-90e5-b6793989abda" (UID: "e1c55305-7d18-44cb-90e5-b6793989abda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.668371 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1c55305-7d18-44cb-90e5-b6793989abda" (UID: "e1c55305-7d18-44cb-90e5-b6793989abda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.691346 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.691371 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdhmd\" (UniqueName: \"kubernetes.io/projected/e1c55305-7d18-44cb-90e5-b6793989abda-kube-api-access-vdhmd\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.691380 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1c55305-7d18-44cb-90e5-b6793989abda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.696934 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a30c12eb-4e83-420a-8064-859689d91d2d" path="/var/lib/kubelet/pods/a30c12eb-4e83-420a-8064-859689d91d2d/volumes" Jan 
27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.713121 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.787285 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.793040 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cfjb\" (UniqueName: \"kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb\") pod \"8c652921-712d-46ee-9683-fd6312e33d1e\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.793602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data\") pod \"8c652921-712d-46ee-9683-fd6312e33d1e\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.793798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs\") pod \"8c652921-712d-46ee-9683-fd6312e33d1e\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.793834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle\") pod \"8c652921-712d-46ee-9683-fd6312e33d1e\" (UID: \"8c652921-712d-46ee-9683-fd6312e33d1e\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.796060 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs" (OuterVolumeSpecName: "logs") pod 
"8c652921-712d-46ee-9683-fd6312e33d1e" (UID: "8c652921-712d-46ee-9683-fd6312e33d1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.799362 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb" (OuterVolumeSpecName: "kube-api-access-7cfjb") pod "8c652921-712d-46ee-9683-fd6312e33d1e" (UID: "8c652921-712d-46ee-9683-fd6312e33d1e"). InnerVolumeSpecName "kube-api-access-7cfjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.824688 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c652921-712d-46ee-9683-fd6312e33d1e" (UID: "8c652921-712d-46ee-9683-fd6312e33d1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.836940 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data" (OuterVolumeSpecName: "config-data") pod "8c652921-712d-46ee-9683-fd6312e33d1e" (UID: "8c652921-712d-46ee-9683-fd6312e33d1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.870050 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897148 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle\") pod \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897224 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data\") pod \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897267 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs\") pod \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897291 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vg99\" (UniqueName: \"kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99\") pod \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\" (UID: \"a518103b-f26e-4f91-9ca9-93f1f8d5e113\") " Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897660 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cfjb\" (UniqueName: \"kubernetes.io/projected/8c652921-712d-46ee-9683-fd6312e33d1e-kube-api-access-7cfjb\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897657 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs" (OuterVolumeSpecName: "logs") pod 
"a518103b-f26e-4f91-9ca9-93f1f8d5e113" (UID: "a518103b-f26e-4f91-9ca9-93f1f8d5e113"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897676 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897724 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c652921-712d-46ee-9683-fd6312e33d1e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.897738 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c652921-712d-46ee-9683-fd6312e33d1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.904495 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99" (OuterVolumeSpecName: "kube-api-access-8vg99") pod "a518103b-f26e-4f91-9ca9-93f1f8d5e113" (UID: "a518103b-f26e-4f91-9ca9-93f1f8d5e113"). InnerVolumeSpecName "kube-api-access-8vg99". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.930341 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data" (OuterVolumeSpecName: "config-data") pod "a518103b-f26e-4f91-9ca9-93f1f8d5e113" (UID: "a518103b-f26e-4f91-9ca9-93f1f8d5e113"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.935063 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a518103b-f26e-4f91-9ca9-93f1f8d5e113" (UID: "a518103b-f26e-4f91-9ca9-93f1f8d5e113"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.936282 4772 generic.go:334] "Generic (PLEG): container finished" podID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerID="67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075" exitCode=0 Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.936346 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerDied","Data":"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.936371 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a518103b-f26e-4f91-9ca9-93f1f8d5e113","Type":"ContainerDied","Data":"ac2ae0550771f81842cdb596165399767b9e27c96df8f6a9cc1d4cc43f6c15e1"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.936389 4772 scope.go:117] "RemoveContainer" containerID="67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.936432 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.938693 4772 generic.go:334] "Generic (PLEG): container finished" podID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" containerID="43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50" exitCode=0 Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.938772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e","Type":"ContainerDied","Data":"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.938820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e","Type":"ContainerDied","Data":"30ded49d3176ba704104aa049f568d3100eaac27bd25914a3c0f0f06ce633e73"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.938894 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.950733 4772 generic.go:334] "Generic (PLEG): container finished" podID="8c652921-712d-46ee-9683-fd6312e33d1e" containerID="b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193" exitCode=0 Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.950836 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.950833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerDied","Data":"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.950972 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8c652921-712d-46ee-9683-fd6312e33d1e","Type":"ContainerDied","Data":"8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.954959 4772 generic.go:334] "Generic (PLEG): container finished" podID="e1c55305-7d18-44cb-90e5-b6793989abda" containerID="a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81" exitCode=0 Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.955126 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.956001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e1c55305-7d18-44cb-90e5-b6793989abda","Type":"ContainerDied","Data":"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.956031 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e1c55305-7d18-44cb-90e5-b6793989abda","Type":"ContainerDied","Data":"8f15691a47c277f39b1b0ae53dfcc7f92ebe7de7d545b4f191aa38d15da7fb88"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.962798 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db0a7ce2-c175-4632-abe5-f35a6b5ce680","Type":"ContainerStarted","Data":"7311b0a3ed84e943d3cdbe87e7d103f916c68b9c89bff3b2198ec268ff4f96f3"} Jan 27 16:41:28 crc kubenswrapper[4772]: I0127 16:41:28.995054 4772 scope.go:117] "RemoveContainer" containerID="8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.014662 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle\") pod \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015063 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7l8\" (UniqueName: \"kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8\") pod \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015217 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data\") pod \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\" (UID: \"4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e\") " Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015826 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015844 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a518103b-f26e-4f91-9ca9-93f1f8d5e113-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015855 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a518103b-f26e-4f91-9ca9-93f1f8d5e113-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.015866 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vg99\" (UniqueName: \"kubernetes.io/projected/a518103b-f26e-4f91-9ca9-93f1f8d5e113-kube-api-access-8vg99\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.026099 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8" (OuterVolumeSpecName: "kube-api-access-2j7l8") pod "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" (UID: "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e"). InnerVolumeSpecName "kube-api-access-2j7l8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.058354 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.063159 4772 scope.go:117] "RemoveContainer" containerID="67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.064055 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075\": container with ID starting with 67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075 not found: ID does not exist" containerID="67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.064084 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075"} err="failed to get container status \"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075\": rpc error: code = NotFound desc = could not find container \"67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075\": container with ID starting with 67369ccebf4181eb0be2f61ead1ad9a24b9c3e7147fed849be8e7d8b2afab075 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.064108 4772 scope.go:117] "RemoveContainer" containerID="8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.064341 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0\": container with ID starting with 8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0 not found: ID does not 
exist" containerID="8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.064356 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0"} err="failed to get container status \"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0\": rpc error: code = NotFound desc = could not find container \"8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0\": container with ID starting with 8f94df1da4ef644e3b2171358f7c0dcee05549e2b5ef22dd8252e4943b46f5c0 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.064367 4772 scope.go:117] "RemoveContainer" containerID="43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.066088 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" (UID: "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.097291 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.102275 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data" (OuterVolumeSpecName: "config-data") pod "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" (UID: "4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.119385 4772 scope.go:117] "RemoveContainer" containerID="43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.120391 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50\": container with ID starting with 43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50 not found: ID does not exist" containerID="43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.120445 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50"} err="failed to get container status \"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50\": rpc error: code = NotFound desc = could not find container \"43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50\": container with ID starting with 43ecbee0f6dc0baf4a170605cf33f5e90020f6dd199b406980e9515061715e50 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.120476 4772 scope.go:117] "RemoveContainer" containerID="b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.120539 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7l8\" (UniqueName: \"kubernetes.io/projected/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-kube-api-access-2j7l8\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.120556 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-config-data\") on node \"crc\" 
DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.120566 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124236 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124682 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c55305-7d18-44cb-90e5-b6793989abda" containerName="nova-cell1-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124700 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c55305-7d18-44cb-90e5-b6793989abda" containerName="nova-cell1-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124715 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" containerName="nova-cell0-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124721 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" containerName="nova-cell0-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124730 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-api" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124739 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-api" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124756 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124762 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124778 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124784 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.124795 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-log" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124801 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-log" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124953 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-log" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124970 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-log" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.124993 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" containerName="nova-api-api" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.125006 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c55305-7d18-44cb-90e5-b6793989abda" containerName="nova-cell1-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.125016 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" containerName="nova-cell0-conductor-conductor" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.125024 4772 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" containerName="nova-metadata-metadata" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.126003 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.129161 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.158656 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.166280 4772 scope.go:117] "RemoveContainer" containerID="193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.170609 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.170587188 podStartE2EDuration="3.170587188s" podCreationTimestamp="2026-01-27 16:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:29.010932598 +0000 UTC m=+5674.991541706" watchObservedRunningTime="2026-01-27 16:41:29.170587188 +0000 UTC m=+5675.151196286" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.194574 4772 scope.go:117] "RemoveContainer" containerID="b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.195234 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193\": container with ID starting with b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193 not found: ID does not exist" 
containerID="b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.195324 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193"} err="failed to get container status \"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193\": rpc error: code = NotFound desc = could not find container \"b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193\": container with ID starting with b7973b4ea385a2bd5b179204a29b45f97b045488c54fbb72800c5104d45dc193 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.195348 4772 scope.go:117] "RemoveContainer" containerID="193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4" Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.196206 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4\": container with ID starting with 193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4 not found: ID does not exist" containerID="193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.196241 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4"} err="failed to get container status \"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4\": rpc error: code = NotFound desc = could not find container \"193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4\": container with ID starting with 193d54411bb1309c9287f87812da125aae0f9213050e8e77849e515da35b0fc4 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.196262 4772 scope.go:117] 
"RemoveContainer" containerID="a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.196584 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.208854 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.218544 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.219923 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.221720 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f178d7e3-af69-4014-8209-5e766a130997-logs\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.221779 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-config-data\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.221807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qptf4\" (UniqueName: \"kubernetes.io/projected/f178d7e3-af69-4014-8209-5e766a130997-kube-api-access-qptf4\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.221834 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.222414 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.228654 4772 scope.go:117] "RemoveContainer" containerID="a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.229713 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.230991 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81\": container with ID starting with a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81 not found: ID does not exist" containerID="a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.231021 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81"} err="failed to get container status \"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81\": rpc error: code = NotFound desc = could not find container \"a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81\": container with ID starting with a2db576fbfc7eed07b697490f956015f0fceaa99ec44f3681ae02d7525bd0c81 not found: ID does not exist" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.242669 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 
16:41:29.251920 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: E0127 16:41:29.253434 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c652921_712d_46ee_9683_fd6312e33d1e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c652921_712d_46ee_9683_fd6312e33d1e.slice/crio-8059f8ca79e1f853876a311e130550fa4d5e0bfbe87838df84c9d7f7b757e1d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda518103b_f26e_4f91_9ca9_93f1f8d5e113.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda518103b_f26e_4f91_9ca9_93f1f8d5e113.slice/crio-ac2ae0550771f81842cdb596165399767b9e27c96df8f6a9cc1d4cc43f6c15e1\": RecentStats: unable to find data in memory cache]" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.291107 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.315380 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.315509 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.318538 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324555 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qptf4\" (UniqueName: \"kubernetes.io/projected/f178d7e3-af69-4014-8209-5e766a130997-kube-api-access-qptf4\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324618 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324685 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324707 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324762 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f178d7e3-af69-4014-8209-5e766a130997-logs\") pod \"nova-api-0\" 
(UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-config-data\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.324841 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqfs\" (UniqueName: \"kubernetes.io/projected/9549c89c-f55f-484d-80b2-ca1ad19bf758-kube-api-access-pgqfs\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.326146 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f178d7e3-af69-4014-8209-5e766a130997-logs\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.341843 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.343655 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f178d7e3-af69-4014-8209-5e766a130997-config-data\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.349823 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qptf4\" (UniqueName: \"kubernetes.io/projected/f178d7e3-af69-4014-8209-5e766a130997-kube-api-access-qptf4\") pod \"nova-api-0\" (UID: \"f178d7e3-af69-4014-8209-5e766a130997\") " pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.374359 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.388791 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.404580 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.406008 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.408664 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.414559 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426290 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-config-data\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426423 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-427mh\" (UniqueName: \"kubernetes.io/projected/3a983cf0-2c51-4d6a-af53-f115f3a57360-kube-api-access-427mh\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc 
kubenswrapper[4772]: I0127 16:41:29.426496 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqfs\" (UniqueName: \"kubernetes.io/projected/9549c89c-f55f-484d-80b2-ca1ad19bf758-kube-api-access-pgqfs\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426621 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.426711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a983cf0-2c51-4d6a-af53-f115f3a57360-logs\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.431104 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.437972 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9549c89c-f55f-484d-80b2-ca1ad19bf758-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.441954 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqfs\" (UniqueName: \"kubernetes.io/projected/9549c89c-f55f-484d-80b2-ca1ad19bf758-kube-api-access-pgqfs\") pod \"nova-cell1-conductor-0\" (UID: \"9549c89c-f55f-484d-80b2-ca1ad19bf758\") " pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.469282 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-config-data\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528775 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-427mh\" (UniqueName: \"kubernetes.io/projected/3a983cf0-2c51-4d6a-af53-f115f3a57360-kube-api-access-427mh\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp92q\" (UniqueName: \"kubernetes.io/projected/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-kube-api-access-fp92q\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528852 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528905 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc 
kubenswrapper[4772]: I0127 16:41:29.528948 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.528981 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a983cf0-2c51-4d6a-af53-f115f3a57360-logs\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.531001 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a983cf0-2c51-4d6a-af53-f115f3a57360-logs\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.532506 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-config-data\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.539804 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a983cf0-2c51-4d6a-af53-f115f3a57360-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.548108 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.554410 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-427mh\" (UniqueName: \"kubernetes.io/projected/3a983cf0-2c51-4d6a-af53-f115f3a57360-kube-api-access-427mh\") pod \"nova-metadata-0\" (UID: \"3a983cf0-2c51-4d6a-af53-f115f3a57360\") " pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.631220 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.631381 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp92q\" (UniqueName: \"kubernetes.io/projected/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-kube-api-access-fp92q\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.631430 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.637375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.641845 
4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.658252 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp92q\" (UniqueName: \"kubernetes.io/projected/3331c1dd-ff2d-4a41-9cb3-731297ae0dc3-kube-api-access-fp92q\") pod \"nova-cell0-conductor-0\" (UID: \"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3\") " pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.727674 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.849327 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:29 crc kubenswrapper[4772]: I0127 16:41:29.945964 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.009799 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qzhn5" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="registry-server" containerID="cri-o://2438505e8dbbe6ceb8683042eba5b21297e5463c63d380b7ca6a75115f1e523f" gracePeriod=2 Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.090614 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.244144 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 16:41:30 crc kubenswrapper[4772]: W0127 16:41:30.253053 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a983cf0_2c51_4d6a_af53_f115f3a57360.slice/crio-d4be5456d48858af909844028917bca2da1295787038c08194bdc6f50d2b0ce3 WatchSource:0}: Error finding container d4be5456d48858af909844028917bca2da1295787038c08194bdc6f50d2b0ce3: Status 404 returned error can't find the container with id d4be5456d48858af909844028917bca2da1295787038c08194bdc6f50d2b0ce3 Jan 27 16:41:30 crc kubenswrapper[4772]: W0127 16:41:30.406262 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3331c1dd_ff2d_4a41_9cb3_731297ae0dc3.slice/crio-4487a1101aa85dfdee48a3a3947a7279c9954c304d0cf3715f872609665629fc WatchSource:0}: Error finding container 4487a1101aa85dfdee48a3a3947a7279c9954c304d0cf3715f872609665629fc: Status 404 returned error can't find the container with id 4487a1101aa85dfdee48a3a3947a7279c9954c304d0cf3715f872609665629fc Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.409782 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.675504 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e" path="/var/lib/kubelet/pods/4b851a7f-5a0c-41fd-9c4a-a5e30c8d389e/volumes" Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.676307 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c652921-712d-46ee-9683-fd6312e33d1e" path="/var/lib/kubelet/pods/8c652921-712d-46ee-9683-fd6312e33d1e/volumes" Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.677065 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a518103b-f26e-4f91-9ca9-93f1f8d5e113" path="/var/lib/kubelet/pods/a518103b-f26e-4f91-9ca9-93f1f8d5e113/volumes" Jan 27 16:41:30 crc kubenswrapper[4772]: I0127 16:41:30.678434 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e1c55305-7d18-44cb-90e5-b6793989abda" path="/var/lib/kubelet/pods/e1c55305-7d18-44cb-90e5-b6793989abda/volumes" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.018804 4772 generic.go:334] "Generic (PLEG): container finished" podID="e18a7da1-2037-446c-8646-76917bb9544b" containerID="2438505e8dbbe6ceb8683042eba5b21297e5463c63d380b7ca6a75115f1e523f" exitCode=0 Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.018887 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerDied","Data":"2438505e8dbbe6ceb8683042eba5b21297e5463c63d380b7ca6a75115f1e523f"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.022027 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a983cf0-2c51-4d6a-af53-f115f3a57360","Type":"ContainerStarted","Data":"64d6d17b19ec020b3dfb6e4355e68bf1830f515f1e0ec8a1112c49d0c4efca66"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.022068 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a983cf0-2c51-4d6a-af53-f115f3a57360","Type":"ContainerStarted","Data":"d4be5456d48858af909844028917bca2da1295787038c08194bdc6f50d2b0ce3"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.023151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9549c89c-f55f-484d-80b2-ca1ad19bf758","Type":"ContainerStarted","Data":"b3bbbec2af58eb256af98ed83442ff28dc180942162c5cd5afdf54e6d460aa0e"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.023287 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9549c89c-f55f-484d-80b2-ca1ad19bf758","Type":"ContainerStarted","Data":"4b21d0ea7a82793879133c00776797657ee4dd5c87a48f04228ecfaaf39dbb27"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.024745 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.029199 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3","Type":"ContainerStarted","Data":"82f4585169c63c6ceec89dc5a4784bbd9d685d5832c552d6bd7983d785cc412c"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.029448 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3331c1dd-ff2d-4a41-9cb3-731297ae0dc3","Type":"ContainerStarted","Data":"4487a1101aa85dfdee48a3a3947a7279c9954c304d0cf3715f872609665629fc"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.030398 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.034917 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f178d7e3-af69-4014-8209-5e766a130997","Type":"ContainerStarted","Data":"7a03bd9d50fe00835e95690a3cd780a1c18122b457f26c56197834d3528434c9"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.035099 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f178d7e3-af69-4014-8209-5e766a130997","Type":"ContainerStarted","Data":"6434c4d76d62b8e8f47a32fd939db880888344c2fda956f49ede17e91fe6bfd0"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.035191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f178d7e3-af69-4014-8209-5e766a130997","Type":"ContainerStarted","Data":"e41ed01429c113075baaa97bbdf723baf1b7aa1af73c8d2579134878b9e4b420"} Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.039761 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.085590 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.085557303 podStartE2EDuration="2.085557303s" podCreationTimestamp="2026-01-27 16:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:31.050047821 +0000 UTC m=+5677.030656919" watchObservedRunningTime="2026-01-27 16:41:31.085557303 +0000 UTC m=+5677.066166391" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.136636 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.136613818 podStartE2EDuration="2.136613818s" podCreationTimestamp="2026-01-27 16:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:31.125425089 +0000 UTC m=+5677.106034197" watchObservedRunningTime="2026-01-27 16:41:31.136613818 +0000 UTC m=+5677.117222916" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.149946 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.149921668 podStartE2EDuration="3.149921668s" podCreationTimestamp="2026-01-27 16:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:31.143180035 +0000 UTC m=+5677.123789153" watchObservedRunningTime="2026-01-27 16:41:31.149921668 +0000 UTC m=+5677.130530766" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.171118 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content\") pod \"e18a7da1-2037-446c-8646-76917bb9544b\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.171554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcwvs\" (UniqueName: \"kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs\") pod \"e18a7da1-2037-446c-8646-76917bb9544b\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.171681 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities\") pod \"e18a7da1-2037-446c-8646-76917bb9544b\" (UID: \"e18a7da1-2037-446c-8646-76917bb9544b\") " Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.172444 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities" (OuterVolumeSpecName: "utilities") pod "e18a7da1-2037-446c-8646-76917bb9544b" (UID: "e18a7da1-2037-446c-8646-76917bb9544b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.173316 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.176265 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs" (OuterVolumeSpecName: "kube-api-access-gcwvs") pod "e18a7da1-2037-446c-8646-76917bb9544b" (UID: "e18a7da1-2037-446c-8646-76917bb9544b"). InnerVolumeSpecName "kube-api-access-gcwvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.206153 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e18a7da1-2037-446c-8646-76917bb9544b" (UID: "e18a7da1-2037-446c-8646-76917bb9544b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.274711 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18a7da1-2037-446c-8646-76917bb9544b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:31 crc kubenswrapper[4772]: I0127 16:41:31.274755 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcwvs\" (UniqueName: \"kubernetes.io/projected/e18a7da1-2037-446c-8646-76917bb9544b-kube-api-access-gcwvs\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.056338 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qzhn5" event={"ID":"e18a7da1-2037-446c-8646-76917bb9544b","Type":"ContainerDied","Data":"54fcdad16f86f21c30f016c1e9b9c8d1af9bb4af36e051b32118c18fdeb26dbf"} Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.057872 4772 scope.go:117] "RemoveContainer" containerID="2438505e8dbbe6ceb8683042eba5b21297e5463c63d380b7ca6a75115f1e523f" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.058146 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qzhn5" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.082778 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3a983cf0-2c51-4d6a-af53-f115f3a57360","Type":"ContainerStarted","Data":"c609b64efb90cd41df1661bc2a9ccff091d7aba93a74339f5dc29950cc106710"} Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.085272 4772 generic.go:334] "Generic (PLEG): container finished" podID="ce725b6a-06bb-4339-819f-fee8819078f0" containerID="33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" exitCode=0 Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.085390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce725b6a-06bb-4339-819f-fee8819078f0","Type":"ContainerDied","Data":"33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c"} Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.120147 4772 scope.go:117] "RemoveContainer" containerID="93df129a7d9ca49629bdaa9f5049664a8fdbaa8353cdb2fc27b598bded3cdb48" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.130034 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.129998193 podStartE2EDuration="3.129998193s" podCreationTimestamp="2026-01-27 16:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:32.117932529 +0000 UTC m=+5678.098541617" watchObservedRunningTime="2026-01-27 16:41:32.129998193 +0000 UTC m=+5678.110607291" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.141109 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.149144 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-qzhn5"] Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.161442 4772 scope.go:117] "RemoveContainer" containerID="5ba8daff8c2db9de85548929fc9125eac49769ff0aed125f736c35420b1b3caf" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.300255 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.564932 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.674665 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18a7da1-2037-446c-8646-76917bb9544b" path="/var/lib/kubelet/pods/e18a7da1-2037-446c-8646-76917bb9544b/volumes" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.710328 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ckmf\" (UniqueName: \"kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf\") pod \"ce725b6a-06bb-4339-819f-fee8819078f0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.710552 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle\") pod \"ce725b6a-06bb-4339-819f-fee8819078f0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.710604 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data\") pod \"ce725b6a-06bb-4339-819f-fee8819078f0\" (UID: \"ce725b6a-06bb-4339-819f-fee8819078f0\") " Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.728929 4772 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf" (OuterVolumeSpecName: "kube-api-access-7ckmf") pod "ce725b6a-06bb-4339-819f-fee8819078f0" (UID: "ce725b6a-06bb-4339-819f-fee8819078f0"). InnerVolumeSpecName "kube-api-access-7ckmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.735005 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data" (OuterVolumeSpecName: "config-data") pod "ce725b6a-06bb-4339-819f-fee8819078f0" (UID: "ce725b6a-06bb-4339-819f-fee8819078f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.736825 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce725b6a-06bb-4339-819f-fee8819078f0" (UID: "ce725b6a-06bb-4339-819f-fee8819078f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.813690 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.813737 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce725b6a-06bb-4339-819f-fee8819078f0-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:32 crc kubenswrapper[4772]: I0127 16:41:32.813751 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ckmf\" (UniqueName: \"kubernetes.io/projected/ce725b6a-06bb-4339-819f-fee8819078f0-kube-api-access-7ckmf\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.097031 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.103030 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ce725b6a-06bb-4339-819f-fee8819078f0","Type":"ContainerDied","Data":"c5ffffe294d87ab27553633999a0d84408edaf82429449f00a94658621086fd0"} Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.103109 4772 scope.go:117] "RemoveContainer" containerID="33aa342145a07c52aa025581718207e0ed3316f1c48915a16075320a2dbeed5c" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.151698 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.175138 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.184705 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:33 crc 
kubenswrapper[4772]: E0127 16:41:33.185576 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="registry-server" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.185599 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="registry-server" Jan 27 16:41:33 crc kubenswrapper[4772]: E0127 16:41:33.185644 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="extract-content" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.185652 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="extract-content" Jan 27 16:41:33 crc kubenswrapper[4772]: E0127 16:41:33.185678 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" containerName="nova-scheduler-scheduler" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.185688 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" containerName="nova-scheduler-scheduler" Jan 27 16:41:33 crc kubenswrapper[4772]: E0127 16:41:33.185702 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="extract-utilities" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.185710 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="extract-utilities" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.186007 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e18a7da1-2037-446c-8646-76917bb9544b" containerName="registry-server" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.186030 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" 
containerName="nova-scheduler-scheduler" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.188055 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.193692 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.195968 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.324239 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lz2\" (UniqueName: \"kubernetes.io/projected/c9832ca2-4d35-4533-bdb3-7ac3773e5242-kube-api-access-85lz2\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.324302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.324324 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-config-data\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.425779 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lz2\" (UniqueName: \"kubernetes.io/projected/c9832ca2-4d35-4533-bdb3-7ac3773e5242-kube-api-access-85lz2\") pod \"nova-scheduler-0\" (UID: 
\"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.426470 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.427425 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-config-data\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.431005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.431373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9832ca2-4d35-4533-bdb3-7ac3773e5242-config-data\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.448209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lz2\" (UniqueName: \"kubernetes.io/projected/c9832ca2-4d35-4533-bdb3-7ac3773e5242-kube-api-access-85lz2\") pod \"nova-scheduler-0\" (UID: \"c9832ca2-4d35-4533-bdb3-7ac3773e5242\") " pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.509836 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 16:41:33 crc kubenswrapper[4772]: I0127 16:41:33.836927 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 16:41:34 crc kubenswrapper[4772]: I0127 16:41:34.109390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9832ca2-4d35-4533-bdb3-7ac3773e5242","Type":"ContainerStarted","Data":"65a9e9b4d2896ed39525d206b510fca660ca26b4570c062a20320a6ceaa82ab3"} Jan 27 16:41:34 crc kubenswrapper[4772]: I0127 16:41:34.109673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c9832ca2-4d35-4533-bdb3-7ac3773e5242","Type":"ContainerStarted","Data":"8c64ae7d9f465a3928274a6bfabff3ff3e174b69628594eef3cbb9e8a9068613"} Jan 27 16:41:34 crc kubenswrapper[4772]: I0127 16:41:34.673599 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce725b6a-06bb-4339-819f-fee8819078f0" path="/var/lib/kubelet/pods/ce725b6a-06bb-4339-819f-fee8819078f0/volumes" Jan 27 16:41:34 crc kubenswrapper[4772]: I0127 16:41:34.728660 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:41:34 crc kubenswrapper[4772]: I0127 16:41:34.728818 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 16:41:35 crc kubenswrapper[4772]: I0127 16:41:35.143373 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.143352506 podStartE2EDuration="2.143352506s" podCreationTimestamp="2026-01-27 16:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:35.138715513 +0000 UTC m=+5681.119324641" watchObservedRunningTime="2026-01-27 16:41:35.143352506 +0000 UTC m=+5681.123961604" Jan 27 16:41:37 crc kubenswrapper[4772]: I0127 
16:41:37.302464 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:37 crc kubenswrapper[4772]: I0127 16:41:37.314133 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:38 crc kubenswrapper[4772]: I0127 16:41:38.161368 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 16:41:38 crc kubenswrapper[4772]: I0127 16:41:38.511283 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.469779 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.470375 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.588595 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.727768 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.728002 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 16:41:39 crc kubenswrapper[4772]: I0127 16:41:39.878697 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 27 16:41:40 crc kubenswrapper[4772]: I0127 16:41:40.551426 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f178d7e3-af69-4014-8209-5e766a130997" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.75:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Jan 27 16:41:40 crc kubenswrapper[4772]: I0127 16:41:40.551715 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f178d7e3-af69-4014-8209-5e766a130997" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.75:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:41:40 crc kubenswrapper[4772]: I0127 16:41:40.811393 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a983cf0-2c51-4d6a-af53-f115f3a57360" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:41:40 crc kubenswrapper[4772]: I0127 16:41:40.811421 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3a983cf0-2c51-4d6a-af53-f115f3a57360" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.77:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 16:41:42 crc kubenswrapper[4772]: I0127 16:41:42.058448 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:41:42 crc kubenswrapper[4772]: I0127 16:41:42.058748 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:41:42 crc kubenswrapper[4772]: I0127 16:41:42.058791 4772 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:41:42 crc kubenswrapper[4772]: I0127 16:41:42.059529 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:41:42 crc kubenswrapper[4772]: I0127 16:41:42.059588 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" gracePeriod=600 Jan 27 16:41:42 crc kubenswrapper[4772]: E0127 16:41:42.189422 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.206600 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" exitCode=0 Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.207266 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69"} Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.207354 4772 scope.go:117] "RemoveContainer" containerID="90e27c06727cf113f54cd7c0344565bfa447b15cc343fc7033a04f41dddb22f9" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.212621 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:41:43 crc kubenswrapper[4772]: E0127 16:41:43.213715 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.231314 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.232846 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.238640 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.267284 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312393 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312467 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312570 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zw7z\" (UniqueName: \"kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312596 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312612 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.312649 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414034 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zw7z\" (UniqueName: \"kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414094 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414112 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414250 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.414289 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.420956 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.421407 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" 
Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.421421 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.422947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.435850 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zw7z\" (UniqueName: \"kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z\") pod \"cinder-scheduler-0\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.511339 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.540653 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 16:41:43 crc kubenswrapper[4772]: I0127 16:41:43.552897 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:44 crc kubenswrapper[4772]: W0127 16:41:44.200370 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb6ca922_71e5_4fa3_ae7b_5137b0e58397.slice/crio-9c8e4b7251239d087f479e7cdf47b4e5ff69599faa90d12e66a25ebf0083f84b WatchSource:0}: Error finding container 9c8e4b7251239d087f479e7cdf47b4e5ff69599faa90d12e66a25ebf0083f84b: Status 404 returned error can't find the container with id 9c8e4b7251239d087f479e7cdf47b4e5ff69599faa90d12e66a25ebf0083f84b Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.200506 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.219576 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerStarted","Data":"9c8e4b7251239d087f479e7cdf47b4e5ff69599faa90d12e66a25ebf0083f84b"} Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.252527 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.725155 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.725797 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api-log" containerID="cri-o://714b9b6c5c6438e8d52919984e494ff614fd56b03d77fad32ca28a5a24726e46" gracePeriod=30 Jan 27 16:41:44 crc kubenswrapper[4772]: I0127 16:41:44.725881 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api" 
containerID="cri-o://31e3d55d1d113b922a3f84db51ffbe33b9ce1af8297930f881b9d6d32892374f" gracePeriod=30 Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.236159 4772 generic.go:334] "Generic (PLEG): container finished" podID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerID="714b9b6c5c6438e8d52919984e494ff614fd56b03d77fad32ca28a5a24726e46" exitCode=143 Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.236489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerDied","Data":"714b9b6c5c6438e8d52919984e494ff614fd56b03d77fad32ca28a5a24726e46"} Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.240969 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerStarted","Data":"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3"} Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.262238 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.264239 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.268757 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.305662 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354564 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-dev\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354616 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354652 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc 
kubenswrapper[4772]: I0127 16:41:45.354749 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354787 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354848 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354870 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 
16:41:45.354896 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-sys\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.354925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-run\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.355036 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.355065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.355087 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55dzs\" (UniqueName: \"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-kube-api-access-55dzs\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.355111 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.355134 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457201 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457267 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457303 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457330 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457337 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457397 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457438 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-sys\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457473 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-run\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457474 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " 
pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457499 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457518 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-run\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457536 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-sys\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457611 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457614 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457631 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55dzs\" (UniqueName: \"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-kube-api-access-55dzs\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457660 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-dev\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457710 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-var-locks-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457731 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457764 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.457917 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-dev\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: 
I0127 16:41:45.457931 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.463525 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.465749 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.466148 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.470871 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.478717 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.481723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55dzs\" (UniqueName: \"kubernetes.io/projected/7b6810b2-bc50-486d-9a87-cf4cd50d33c5-kube-api-access-55dzs\") pod \"cinder-volume-volume1-0\" (UID: \"7b6810b2-bc50-486d-9a87-cf4cd50d33c5\") " pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.589874 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.792006 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.795416 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.797958 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.811723 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-ceph\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869635 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86z5c\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-kube-api-access-86z5c\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869678 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-dev\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869710 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-scripts\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869741 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-lib-modules\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869802 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869851 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-sys\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-run\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869925 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.869960 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.870006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.870030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.870074 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-lib-cinder\") pod \"cinder-backup-0\" (UID: 
\"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.870114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-ceph\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971739 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86z5c\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-kube-api-access-86z5c\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-dev\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971795 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-scripts\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-lib-modules\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971854 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971886 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971932 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-sys\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971968 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-run\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " 
pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.971990 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972063 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972084 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972130 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972314 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-sys\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972316 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972341 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-run\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972379 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-dev\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972403 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972679 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972731 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-lib-modules\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.972879 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/37170132-cd9f-44e7-827d-b98486cefb39-etc-nvme\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.976754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-ceph\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " 
pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.977621 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-scripts\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.977818 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data-custom\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.992485 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-config-data\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.994386 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37170132-cd9f-44e7-827d-b98486cefb39-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:45 crc kubenswrapper[4772]: I0127 16:41:45.994632 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86z5c\" (UniqueName: \"kubernetes.io/projected/37170132-cd9f-44e7-827d-b98486cefb39-kube-api-access-86z5c\") pod \"cinder-backup-0\" (UID: \"37170132-cd9f-44e7-827d-b98486cefb39\") " pod="openstack/cinder-backup-0" Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 16:41:46.128232 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 16:41:46.204501 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 27 16:41:46 crc kubenswrapper[4772]: W0127 16:41:46.213304 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b6810b2_bc50_486d_9a87_cf4cd50d33c5.slice/crio-f6537ff34af5a48afbc027cf852a989774ed686a8c36aef611b39b70b4d39635 WatchSource:0}: Error finding container f6537ff34af5a48afbc027cf852a989774ed686a8c36aef611b39b70b4d39635: Status 404 returned error can't find the container with id f6537ff34af5a48afbc027cf852a989774ed686a8c36aef611b39b70b4d39635 Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 16:41:46.266008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerStarted","Data":"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef"} Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 16:41:46.277433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"7b6810b2-bc50-486d-9a87-cf4cd50d33c5","Type":"ContainerStarted","Data":"f6537ff34af5a48afbc027cf852a989774ed686a8c36aef611b39b70b4d39635"} Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 16:41:46.294541 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.2944987550000002 podStartE2EDuration="3.294498755s" podCreationTimestamp="2026-01-27 16:41:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:46.287590048 +0000 UTC m=+5692.268199146" watchObservedRunningTime="2026-01-27 16:41:46.294498755 +0000 UTC m=+5692.275107853" Jan 27 16:41:46 crc kubenswrapper[4772]: I0127 
16:41:46.691388 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 27 16:41:47 crc kubenswrapper[4772]: I0127 16:41:47.299357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37170132-cd9f-44e7-827d-b98486cefb39","Type":"ContainerStarted","Data":"22a8fa900207d0302c8abf2e457cf9376b2ee0a0d9600c94dbfb5249680a1692"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.316691 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37170132-cd9f-44e7-827d-b98486cefb39","Type":"ContainerStarted","Data":"dc6e30a448451b97f1266ec3d9ad50ad6ae4ebb2ac72cb94a8a9cb432bf1f514"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.317621 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"37170132-cd9f-44e7-827d-b98486cefb39","Type":"ContainerStarted","Data":"a2c049f67758489a77e066df2431ee96e2428bbd04db38037a834c967aeda936"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.319408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"7b6810b2-bc50-486d-9a87-cf4cd50d33c5","Type":"ContainerStarted","Data":"07fb537d1a94745b9dd392726b4097426196d8f9517eb850bd68fd5c08583620"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.319451 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"7b6810b2-bc50-486d-9a87-cf4cd50d33c5","Type":"ContainerStarted","Data":"356a72ff887fc4a895e6fe05bc640c45244dd9e2d55a5f7750e045dc0184530f"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.324233 4772 generic.go:334] "Generic (PLEG): container finished" podID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerID="31e3d55d1d113b922a3f84db51ffbe33b9ce1af8297930f881b9d6d32892374f" exitCode=0 Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.324281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-api-0" event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerDied","Data":"31e3d55d1d113b922a3f84db51ffbe33b9ce1af8297930f881b9d6d32892374f"} Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.349782 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.54014595 podStartE2EDuration="3.349737807s" podCreationTimestamp="2026-01-27 16:41:45 +0000 UTC" firstStartedPulling="2026-01-27 16:41:46.714420344 +0000 UTC m=+5692.695029432" lastFinishedPulling="2026-01-27 16:41:47.524012191 +0000 UTC m=+5693.504621289" observedRunningTime="2026-01-27 16:41:48.344451337 +0000 UTC m=+5694.325060425" watchObservedRunningTime="2026-01-27 16:41:48.349737807 +0000 UTC m=+5694.330346905" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.389912 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.443403814 podStartE2EDuration="3.389882692s" podCreationTimestamp="2026-01-27 16:41:45 +0000 UTC" firstStartedPulling="2026-01-27 16:41:46.215672518 +0000 UTC m=+5692.196281616" lastFinishedPulling="2026-01-27 16:41:47.162151396 +0000 UTC m=+5693.142760494" observedRunningTime="2026-01-27 16:41:48.382955414 +0000 UTC m=+5694.363564522" watchObservedRunningTime="2026-01-27 16:41:48.389882692 +0000 UTC m=+5694.370491790" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.430742 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.539778 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.540155 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.540234 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.541063 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.541312 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.541665 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.541772 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf7qv\" (UniqueName: \"kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.541938 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts\") pod \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\" (UID: \"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f\") " Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.542203 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs" (OuterVolumeSpecName: "logs") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.543019 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.543035 4772 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-logs\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.546511 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts" (OuterVolumeSpecName: "scripts") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.546967 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv" (OuterVolumeSpecName: "kube-api-access-rf7qv") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "kube-api-access-rf7qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.547598 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.554735 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.615888 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.643256 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data" (OuterVolumeSpecName: "config-data") pod "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" (UID: "1d692e3b-b4e1-4af1-8cb1-a64a6e51916f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.645456 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf7qv\" (UniqueName: \"kubernetes.io/projected/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-kube-api-access-rf7qv\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.645493 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.645508 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.645522 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:48 crc kubenswrapper[4772]: I0127 16:41:48.645535 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.339760 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.339763 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1d692e3b-b4e1-4af1-8cb1-a64a6e51916f","Type":"ContainerDied","Data":"1985a43f384e22d29818a239c3a060f96131fc47f07c59ec7571d217b8454dc4"} Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.340727 4772 scope.go:117] "RemoveContainer" containerID="31e3d55d1d113b922a3f84db51ffbe33b9ce1af8297930f881b9d6d32892374f" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.399087 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.404541 4772 scope.go:117] "RemoveContainer" containerID="714b9b6c5c6438e8d52919984e494ff614fd56b03d77fad32ca28a5a24726e46" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.409794 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.420015 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:49 crc kubenswrapper[4772]: E0127 16:41:49.420584 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.420614 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api" Jan 27 16:41:49 crc kubenswrapper[4772]: E0127 16:41:49.420635 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api-log" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.420646 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api-log" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.420865 4772 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api-log" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.420884 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" containerName="cinder-api" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.422104 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.428480 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.428521 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.473653 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.473978 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.476043 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.480332 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.560475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-scripts\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.561579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wlbg8\" (UniqueName: \"kubernetes.io/projected/a46febaf-97b6-4ed3-8958-316e2a542a5f-kube-api-access-wlbg8\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.561827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a46febaf-97b6-4ed3-8958-316e2a542a5f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.561935 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.562016 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a46febaf-97b6-4ed3-8958-316e2a542a5f-logs\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.562130 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data-custom\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.562255 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data\") pod 
\"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663248 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlbg8\" (UniqueName: \"kubernetes.io/projected/a46febaf-97b6-4ed3-8958-316e2a542a5f-kube-api-access-wlbg8\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663637 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a46febaf-97b6-4ed3-8958-316e2a542a5f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663664 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663683 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a46febaf-97b6-4ed3-8958-316e2a542a5f-logs\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663714 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data-custom\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663740 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a46febaf-97b6-4ed3-8958-316e2a542a5f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663747 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.663910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-scripts\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.664807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a46febaf-97b6-4ed3-8958-316e2a542a5f-logs\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.669602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.671010 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data-custom\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: 
I0127 16:41:49.671144 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-config-data\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.680933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a46febaf-97b6-4ed3-8958-316e2a542a5f-scripts\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.683814 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlbg8\" (UniqueName: \"kubernetes.io/projected/a46febaf-97b6-4ed3-8958-316e2a542a5f-kube-api-access-wlbg8\") pod \"cinder-api-0\" (UID: \"a46febaf-97b6-4ed3-8958-316e2a542a5f\") " pod="openstack/cinder-api-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.730663 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.731474 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.732530 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 16:41:49 crc kubenswrapper[4772]: I0127 16:41:49.752559 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.207766 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.352733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a46febaf-97b6-4ed3-8958-316e2a542a5f","Type":"ContainerStarted","Data":"287b252ce68976793728fff56a71a7cca1bfb0340ced44587ebaf86607e9529a"} Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.353653 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.355589 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.357433 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.593389 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:50 crc kubenswrapper[4772]: I0127 16:41:50.681011 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d692e3b-b4e1-4af1-8cb1-a64a6e51916f" path="/var/lib/kubelet/pods/1d692e3b-b4e1-4af1-8cb1-a64a6e51916f/volumes" Jan 27 16:41:51 crc kubenswrapper[4772]: I0127 16:41:51.129553 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 27 16:41:51 crc kubenswrapper[4772]: I0127 16:41:51.367801 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a46febaf-97b6-4ed3-8958-316e2a542a5f","Type":"ContainerStarted","Data":"d7165bee75b99f5d5e7f1a0cfe3be5a235bea586658dcfb51148e8422a49f2ba"} Jan 27 16:41:52 crc kubenswrapper[4772]: I0127 16:41:52.378123 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"a46febaf-97b6-4ed3-8958-316e2a542a5f","Type":"ContainerStarted","Data":"bebfac65418099316deacbc1c27d9c9d4cefb5cd8c9460548100c4cee37449e7"} Jan 27 16:41:52 crc kubenswrapper[4772]: I0127 16:41:52.402002 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.4019864220000002 podStartE2EDuration="3.401986422s" podCreationTimestamp="2026-01-27 16:41:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:52.39911032 +0000 UTC m=+5698.379719438" watchObservedRunningTime="2026-01-27 16:41:52.401986422 +0000 UTC m=+5698.382595520" Jan 27 16:41:53 crc kubenswrapper[4772]: I0127 16:41:53.385642 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 16:41:53 crc kubenswrapper[4772]: I0127 16:41:53.783740 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 16:41:53 crc kubenswrapper[4772]: I0127 16:41:53.841398 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:54 crc kubenswrapper[4772]: I0127 16:41:54.399380 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="cinder-scheduler" containerID="cri-o://be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3" gracePeriod=30 Jan 27 16:41:54 crc kubenswrapper[4772]: I0127 16:41:54.400485 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="probe" containerID="cri-o://ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef" gracePeriod=30 Jan 27 16:41:54 crc kubenswrapper[4772]: I0127 
16:41:54.682976 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:41:54 crc kubenswrapper[4772]: E0127 16:41:54.683440 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:41:55 crc kubenswrapper[4772]: I0127 16:41:55.415337 4772 generic.go:334] "Generic (PLEG): container finished" podID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerID="ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef" exitCode=0 Jan 27 16:41:55 crc kubenswrapper[4772]: I0127 16:41:55.415458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerDied","Data":"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef"} Jan 27 16:41:55 crc kubenswrapper[4772]: I0127 16:41:55.785241 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Jan 27 16:41:56 crc kubenswrapper[4772]: I0127 16:41:56.349334 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Jan 27 16:41:56 crc kubenswrapper[4772]: I0127 16:41:56.971248 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114219 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114305 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zw7z\" (UniqueName: \"kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114407 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114545 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts\") pod \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\" (UID: \"cb6ca922-71e5-4fa3-ae7b-5137b0e58397\") " Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.114665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.115103 4772 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.123430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z" (OuterVolumeSpecName: "kube-api-access-8zw7z") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "kube-api-access-8zw7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.123465 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts" (OuterVolumeSpecName: "scripts") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.123430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.165900 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.216604 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.216635 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.216646 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zw7z\" (UniqueName: \"kubernetes.io/projected/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-kube-api-access-8zw7z\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.216659 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.228813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data" (OuterVolumeSpecName: "config-data") pod "cb6ca922-71e5-4fa3-ae7b-5137b0e58397" (UID: "cb6ca922-71e5-4fa3-ae7b-5137b0e58397"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.318059 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb6ca922-71e5-4fa3-ae7b-5137b0e58397-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.437493 4772 generic.go:334] "Generic (PLEG): container finished" podID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerID="be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3" exitCode=0 Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.437540 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerDied","Data":"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3"} Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.437568 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb6ca922-71e5-4fa3-ae7b-5137b0e58397","Type":"ContainerDied","Data":"9c8e4b7251239d087f479e7cdf47b4e5ff69599faa90d12e66a25ebf0083f84b"} Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.437590 4772 scope.go:117] "RemoveContainer" containerID="ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.437591 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.473087 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.474901 4772 scope.go:117] "RemoveContainer" containerID="be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.483987 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.495619 4772 scope.go:117] "RemoveContainer" containerID="ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef" Jan 27 16:41:57 crc kubenswrapper[4772]: E0127 16:41:57.496438 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef\": container with ID starting with ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef not found: ID does not exist" containerID="ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.496595 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef"} err="failed to get container status \"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef\": rpc error: code = NotFound desc = could not find container \"ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef\": container with ID starting with ece33e666a3e58ee419da4cf74156d864bda223101f1f560e453f6c6c89635ef not found: ID does not exist" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.496733 4772 scope.go:117] "RemoveContainer" containerID="be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3" Jan 27 16:41:57 crc 
kubenswrapper[4772]: E0127 16:41:57.497289 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3\": container with ID starting with be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3 not found: ID does not exist" containerID="be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.497349 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3"} err="failed to get container status \"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3\": rpc error: code = NotFound desc = could not find container \"be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3\": container with ID starting with be1b9f8d0c257a0e1d0419cf2c1278df0e274331eb3adeef663b50de9e49f2f3 not found: ID does not exist" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.507631 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:57 crc kubenswrapper[4772]: E0127 16:41:57.508066 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="cinder-scheduler" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.508090 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="cinder-scheduler" Jan 27 16:41:57 crc kubenswrapper[4772]: E0127 16:41:57.508120 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="probe" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.508128 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="probe" Jan 27 16:41:57 crc 
kubenswrapper[4772]: I0127 16:41:57.508381 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="probe" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.508408 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" containerName="cinder-scheduler" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.511493 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.513560 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.519855 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623012 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66512484-80ba-4887-b9a9-9cc87a65ad18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623112 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623130 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-scripts\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " 
pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623153 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623259 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vtw\" (UniqueName: \"kubernetes.io/projected/66512484-80ba-4887-b9a9-9cc87a65ad18-kube-api-access-m6vtw\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.623281 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.726400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vtw\" (UniqueName: \"kubernetes.io/projected/66512484-80ba-4887-b9a9-9cc87a65ad18-kube-api-access-m6vtw\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.726479 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc 
kubenswrapper[4772]: I0127 16:41:57.726527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66512484-80ba-4887-b9a9-9cc87a65ad18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.726741 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.726786 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-scripts\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.726836 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.728343 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/66512484-80ba-4887-b9a9-9cc87a65ad18-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.733144 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-scripts\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.733243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.733821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.737352 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/66512484-80ba-4887-b9a9-9cc87a65ad18-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.760363 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vtw\" (UniqueName: \"kubernetes.io/projected/66512484-80ba-4887-b9a9-9cc87a65ad18-kube-api-access-m6vtw\") pod \"cinder-scheduler-0\" (UID: \"66512484-80ba-4887-b9a9-9cc87a65ad18\") " pod="openstack/cinder-scheduler-0" Jan 27 16:41:57 crc kubenswrapper[4772]: I0127 16:41:57.847209 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 16:41:58 crc kubenswrapper[4772]: I0127 16:41:58.299630 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 16:41:58 crc kubenswrapper[4772]: W0127 16:41:58.306186 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66512484_80ba_4887_b9a9_9cc87a65ad18.slice/crio-985270c438f483f088dfdeaaea8bfdcc4d8df2c83364557f3669e0b747fb7c38 WatchSource:0}: Error finding container 985270c438f483f088dfdeaaea8bfdcc4d8df2c83364557f3669e0b747fb7c38: Status 404 returned error can't find the container with id 985270c438f483f088dfdeaaea8bfdcc4d8df2c83364557f3669e0b747fb7c38 Jan 27 16:41:58 crc kubenswrapper[4772]: I0127 16:41:58.448904 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"66512484-80ba-4887-b9a9-9cc87a65ad18","Type":"ContainerStarted","Data":"985270c438f483f088dfdeaaea8bfdcc4d8df2c83364557f3669e0b747fb7c38"} Jan 27 16:41:58 crc kubenswrapper[4772]: I0127 16:41:58.674124 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6ca922-71e5-4fa3-ae7b-5137b0e58397" path="/var/lib/kubelet/pods/cb6ca922-71e5-4fa3-ae7b-5137b0e58397/volumes" Jan 27 16:41:59 crc kubenswrapper[4772]: I0127 16:41:59.476638 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"66512484-80ba-4887-b9a9-9cc87a65ad18","Type":"ContainerStarted","Data":"4190615e9df1af6789d9abb9ce07935a57894feb9b066d1087504e28e29c1ae6"} Jan 27 16:41:59 crc kubenswrapper[4772]: I0127 16:41:59.477030 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"66512484-80ba-4887-b9a9-9cc87a65ad18","Type":"ContainerStarted","Data":"8998b51dedfa1e5b10014c825ed5e1f50ffed9f00543215a444700336d946133"} Jan 27 16:41:59 crc kubenswrapper[4772]: I0127 16:41:59.506966 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.506947911 podStartE2EDuration="2.506947911s" podCreationTimestamp="2026-01-27 16:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:41:59.504581053 +0000 UTC m=+5705.485190161" watchObservedRunningTime="2026-01-27 16:41:59.506947911 +0000 UTC m=+5705.487557009" Jan 27 16:42:01 crc kubenswrapper[4772]: I0127 16:42:01.553594 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 16:42:02 crc kubenswrapper[4772]: I0127 16:42:02.847918 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 16:42:08 crc kubenswrapper[4772]: I0127 16:42:08.054581 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 16:42:09 crc kubenswrapper[4772]: I0127 16:42:09.664483 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:42:09 crc kubenswrapper[4772]: E0127 16:42:09.664786 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:42:22 crc kubenswrapper[4772]: I0127 16:42:22.664447 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:42:22 crc kubenswrapper[4772]: E0127 16:42:22.666017 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:42:35 crc kubenswrapper[4772]: I0127 16:42:35.663193 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:42:35 crc kubenswrapper[4772]: E0127 16:42:35.664043 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:42:48 crc kubenswrapper[4772]: I0127 16:42:48.662913 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:42:48 crc kubenswrapper[4772]: E0127 16:42:48.663808 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.221201 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.223956 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.229857 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.302625 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.302952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.303051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6phf\" (UniqueName: \"kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.404798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6phf\" (UniqueName: \"kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.404912 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.404942 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.405636 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.405725 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.428059 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6phf\" (UniqueName: \"kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf\") pod \"certified-operators-2zjm2\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:42:59 crc kubenswrapper[4772]: I0127 16:42:59.559999 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:00 crc kubenswrapper[4772]: I0127 16:43:00.175427 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:43:01 crc kubenswrapper[4772]: I0127 16:43:01.019032 4772 generic.go:334] "Generic (PLEG): container finished" podID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerID="3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a" exitCode=0 Jan 27 16:43:01 crc kubenswrapper[4772]: I0127 16:43:01.019101 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerDied","Data":"3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a"} Jan 27 16:43:01 crc kubenswrapper[4772]: I0127 16:43:01.019493 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerStarted","Data":"ebcd2a276f29c5a3831867dbda33105f239b6023bcb91f062252b087f3f0823f"} Jan 27 16:43:01 crc kubenswrapper[4772]: I0127 16:43:01.664152 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:43:01 crc kubenswrapper[4772]: E0127 16:43:01.664681 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:43:03 crc kubenswrapper[4772]: I0127 16:43:03.038428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" 
event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerStarted","Data":"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b"} Jan 27 16:43:04 crc kubenswrapper[4772]: I0127 16:43:04.048325 4772 generic.go:334] "Generic (PLEG): container finished" podID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerID="603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b" exitCode=0 Jan 27 16:43:04 crc kubenswrapper[4772]: I0127 16:43:04.048429 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerDied","Data":"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b"} Jan 27 16:43:05 crc kubenswrapper[4772]: I0127 16:43:05.060712 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerStarted","Data":"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d"} Jan 27 16:43:05 crc kubenswrapper[4772]: I0127 16:43:05.088361 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2zjm2" podStartSLOduration=2.5869534119999997 podStartE2EDuration="6.088337615s" podCreationTimestamp="2026-01-27 16:42:59 +0000 UTC" firstStartedPulling="2026-01-27 16:43:01.022244837 +0000 UTC m=+5767.002853935" lastFinishedPulling="2026-01-27 16:43:04.52362904 +0000 UTC m=+5770.504238138" observedRunningTime="2026-01-27 16:43:05.077387143 +0000 UTC m=+5771.057996241" watchObservedRunningTime="2026-01-27 16:43:05.088337615 +0000 UTC m=+5771.068946713" Jan 27 16:43:07 crc kubenswrapper[4772]: I0127 16:43:07.483415 4772 scope.go:117] "RemoveContainer" containerID="f3387679ac491bfa95cee52f7df561b5ae6817e29d93d8687efd8b7b43af3938" Jan 27 16:43:07 crc kubenswrapper[4772]: I0127 16:43:07.504004 4772 scope.go:117] "RemoveContainer" 
containerID="71552776ddecc62c8f2d51f59b6ec4b6ba66501bb9fcfb2256ecea66da9231d2" Jan 27 16:43:09 crc kubenswrapper[4772]: I0127 16:43:09.561096 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:09 crc kubenswrapper[4772]: I0127 16:43:09.561414 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:09 crc kubenswrapper[4772]: I0127 16:43:09.609062 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:10 crc kubenswrapper[4772]: I0127 16:43:10.150854 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:10 crc kubenswrapper[4772]: I0127 16:43:10.201910 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.121155 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2zjm2" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="registry-server" containerID="cri-o://d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d" gracePeriod=2 Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.596453 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.660443 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities\") pod \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.660564 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6phf\" (UniqueName: \"kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf\") pod \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.660667 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content\") pod \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\" (UID: \"459295b8-ff6b-4e68-86ef-2db5ccb48ebd\") " Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.661368 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities" (OuterVolumeSpecName: "utilities") pod "459295b8-ff6b-4e68-86ef-2db5ccb48ebd" (UID: "459295b8-ff6b-4e68-86ef-2db5ccb48ebd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.665523 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf" (OuterVolumeSpecName: "kube-api-access-w6phf") pod "459295b8-ff6b-4e68-86ef-2db5ccb48ebd" (UID: "459295b8-ff6b-4e68-86ef-2db5ccb48ebd"). InnerVolumeSpecName "kube-api-access-w6phf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.709419 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "459295b8-ff6b-4e68-86ef-2db5ccb48ebd" (UID: "459295b8-ff6b-4e68-86ef-2db5ccb48ebd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.762937 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.762967 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:43:12 crc kubenswrapper[4772]: I0127 16:43:12.762978 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6phf\" (UniqueName: \"kubernetes.io/projected/459295b8-ff6b-4e68-86ef-2db5ccb48ebd-kube-api-access-w6phf\") on node \"crc\" DevicePath \"\"" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.131695 4772 generic.go:334] "Generic (PLEG): container finished" podID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerID="d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d" exitCode=0 Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.131740 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerDied","Data":"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d"} Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.131772 4772 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2zjm2" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.131782 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2zjm2" event={"ID":"459295b8-ff6b-4e68-86ef-2db5ccb48ebd","Type":"ContainerDied","Data":"ebcd2a276f29c5a3831867dbda33105f239b6023bcb91f062252b087f3f0823f"} Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.131807 4772 scope.go:117] "RemoveContainer" containerID="d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.154477 4772 scope.go:117] "RemoveContainer" containerID="603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.181820 4772 scope.go:117] "RemoveContainer" containerID="3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.185224 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.195245 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2zjm2"] Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.218810 4772 scope.go:117] "RemoveContainer" containerID="d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d" Jan 27 16:43:13 crc kubenswrapper[4772]: E0127 16:43:13.219454 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d\": container with ID starting with d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d not found: ID does not exist" containerID="d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.219516 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d"} err="failed to get container status \"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d\": rpc error: code = NotFound desc = could not find container \"d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d\": container with ID starting with d32818b18c7ff11f1c88b751ea1897cc762fa68917a75c87760f093e1d49259d not found: ID does not exist" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.219559 4772 scope.go:117] "RemoveContainer" containerID="603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b" Jan 27 16:43:13 crc kubenswrapper[4772]: E0127 16:43:13.220034 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b\": container with ID starting with 603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b not found: ID does not exist" containerID="603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.220075 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b"} err="failed to get container status \"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b\": rpc error: code = NotFound desc = could not find container \"603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b\": container with ID starting with 603104fcf1485a95a1485cbf6e7d7243393b9a93d0ca768f32888f4b677dac6b not found: ID does not exist" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.220117 4772 scope.go:117] "RemoveContainer" containerID="3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a" Jan 27 16:43:13 crc kubenswrapper[4772]: E0127 
16:43:13.220580 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a\": container with ID starting with 3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a not found: ID does not exist" containerID="3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a" Jan 27 16:43:13 crc kubenswrapper[4772]: I0127 16:43:13.220636 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a"} err="failed to get container status \"3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a\": rpc error: code = NotFound desc = could not find container \"3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a\": container with ID starting with 3f844d16bdc9ff2e0498bc2f66feb122a928a2a7de272c003efb7d14c3fb197a not found: ID does not exist" Jan 27 16:43:14 crc kubenswrapper[4772]: I0127 16:43:14.670088 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:43:14 crc kubenswrapper[4772]: E0127 16:43:14.670366 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:43:14 crc kubenswrapper[4772]: I0127 16:43:14.674402 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" path="/var/lib/kubelet/pods/459295b8-ff6b-4e68-86ef-2db5ccb48ebd/volumes" Jan 27 16:43:26 crc kubenswrapper[4772]: I0127 16:43:26.663386 
4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:43:26 crc kubenswrapper[4772]: E0127 16:43:26.664564 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:43:31 crc kubenswrapper[4772]: I0127 16:43:31.041869 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-jb5ch"] Jan 27 16:43:31 crc kubenswrapper[4772]: I0127 16:43:31.052663 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-jb5ch"] Jan 27 16:43:32 crc kubenswrapper[4772]: I0127 16:43:32.028206 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0c73-account-create-update-nz44z"] Jan 27 16:43:32 crc kubenswrapper[4772]: I0127 16:43:32.036311 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0c73-account-create-update-nz44z"] Jan 27 16:43:32 crc kubenswrapper[4772]: I0127 16:43:32.675265 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5292a043-9ee5-4d14-a991-c50dbf4d136e" path="/var/lib/kubelet/pods/5292a043-9ee5-4d14-a991-c50dbf4d136e/volumes" Jan 27 16:43:32 crc kubenswrapper[4772]: I0127 16:43:32.676052 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8906f-73a7-4580-81c8-ec81439faea5" path="/var/lib/kubelet/pods/d7e8906f-73a7-4580-81c8-ec81439faea5/volumes" Jan 27 16:43:37 crc kubenswrapper[4772]: I0127 16:43:37.057637 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ls87v"] Jan 27 16:43:37 crc kubenswrapper[4772]: I0127 
16:43:37.066257 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ls87v"] Jan 27 16:43:38 crc kubenswrapper[4772]: I0127 16:43:38.677924 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e179bb8-f9e4-434d-9636-84cc97d632fb" path="/var/lib/kubelet/pods/9e179bb8-f9e4-434d-9636-84cc97d632fb/volumes" Jan 27 16:43:41 crc kubenswrapper[4772]: I0127 16:43:41.663717 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:43:41 crc kubenswrapper[4772]: E0127 16:43:41.664299 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.937353 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-s5xbr"] Jan 27 16:43:47 crc kubenswrapper[4772]: E0127 16:43:47.970627 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="extract-utilities" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.970674 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="extract-utilities" Jan 27 16:43:47 crc kubenswrapper[4772]: E0127 16:43:47.970725 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="extract-content" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.970734 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="extract-content" Jan 27 
16:43:47 crc kubenswrapper[4772]: E0127 16:43:47.970753 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="registry-server" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.970761 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="registry-server" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.971275 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="459295b8-ff6b-4e68-86ef-2db5ccb48ebd" containerName="registry-server" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.975943 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.979230 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.979479 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-zmzrv" Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.990116 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jv694"] Jan 27 16:43:47 crc kubenswrapper[4772]: I0127 16:43:47.992824 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.011755 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s5xbr"] Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.025980 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jv694"] Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.051820 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.051875 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-log\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.051928 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-run\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052021 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxsl\" (UniqueName: \"kubernetes.io/projected/7e5eabb2-229a-4d75-b62d-65be688f753a-kube-api-access-xvxsl\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 
16:43:48.052063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-lib\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-etc-ovs\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052117 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052451 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/febb140e-d26e-43db-9924-0f06739b9a4a-scripts\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052489 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-log-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052534 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e5eabb2-229a-4d75-b62d-65be688f753a-scripts\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.052562 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bh9\" (UniqueName: \"kubernetes.io/projected/febb140e-d26e-43db-9924-0f06739b9a4a-kube-api-access-p5bh9\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.153986 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/febb140e-d26e-43db-9924-0f06739b9a4a-scripts\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154124 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-log-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154189 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e5eabb2-229a-4d75-b62d-65be688f753a-scripts\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bh9\" (UniqueName: 
\"kubernetes.io/projected/febb140e-d26e-43db-9924-0f06739b9a4a-kube-api-access-p5bh9\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154227 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-log\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154297 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-run\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154397 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxsl\" (UniqueName: \"kubernetes.io/projected/7e5eabb2-229a-4d75-b62d-65be688f753a-kube-api-access-xvxsl\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154436 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-lib\") pod \"ovn-controller-ovs-s5xbr\" (UID: 
\"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154459 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-etc-ovs\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154505 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.155015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-etc-ovs\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.155034 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-run\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154978 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-log\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154978 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-log-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.154933 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/febb140e-d26e-43db-9924-0f06739b9a4a-var-run-ovn\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.155040 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7e5eabb2-229a-4d75-b62d-65be688f753a-var-lib\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.156869 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/febb140e-d26e-43db-9924-0f06739b9a4a-scripts\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.157248 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e5eabb2-229a-4d75-b62d-65be688f753a-scripts\") pod \"ovn-controller-ovs-s5xbr\" (UID: 
\"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.177261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bh9\" (UniqueName: \"kubernetes.io/projected/febb140e-d26e-43db-9924-0f06739b9a4a-kube-api-access-p5bh9\") pod \"ovn-controller-jv694\" (UID: \"febb140e-d26e-43db-9924-0f06739b9a4a\") " pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.177398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxsl\" (UniqueName: \"kubernetes.io/projected/7e5eabb2-229a-4d75-b62d-65be688f753a-kube-api-access-xvxsl\") pod \"ovn-controller-ovs-s5xbr\" (UID: \"7e5eabb2-229a-4d75-b62d-65be688f753a\") " pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.326185 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.336978 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-jv694" Jan 27 16:43:48 crc kubenswrapper[4772]: I0127 16:43:48.978619 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jv694"] Jan 27 16:43:49 crc kubenswrapper[4772]: I0127 16:43:49.227344 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-s5xbr"] Jan 27 16:43:49 crc kubenswrapper[4772]: W0127 16:43:49.242107 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e5eabb2_229a_4d75_b62d_65be688f753a.slice/crio-5a385b760c7c2a71eac14dab5e06934c0c974193b8985265d741d828f09cd81e WatchSource:0}: Error finding container 5a385b760c7c2a71eac14dab5e06934c0c974193b8985265d741d828f09cd81e: Status 404 returned error can't find the container with id 5a385b760c7c2a71eac14dab5e06934c0c974193b8985265d741d828f09cd81e Jan 27 16:43:49 crc kubenswrapper[4772]: I0127 16:43:49.525238 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s5xbr" event={"ID":"7e5eabb2-229a-4d75-b62d-65be688f753a","Type":"ContainerStarted","Data":"5a385b760c7c2a71eac14dab5e06934c0c974193b8985265d741d828f09cd81e"} Jan 27 16:43:49 crc kubenswrapper[4772]: I0127 16:43:49.527255 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jv694" event={"ID":"febb140e-d26e-43db-9924-0f06739b9a4a","Type":"ContainerStarted","Data":"dd533541b3646484c7bf20aa3cbde14d63fba54329547e4d5b3fb8effd1c14d4"} Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.491145 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-sx2ff"] Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.492597 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.494864 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.499423 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-config\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.499473 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovn-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.499507 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovs-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.499546 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7wr\" (UniqueName: \"kubernetes.io/projected/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-kube-api-access-mj7wr\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.512864 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-metrics-sx2ff"] Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.541828 4772 generic.go:334] "Generic (PLEG): container finished" podID="7e5eabb2-229a-4d75-b62d-65be688f753a" containerID="e9cc226b04b7fdd0ec9bbe67c4798430f89dc91b6835ece11ecc4a31026a5e90" exitCode=0 Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.542152 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s5xbr" event={"ID":"7e5eabb2-229a-4d75-b62d-65be688f753a","Type":"ContainerDied","Data":"e9cc226b04b7fdd0ec9bbe67c4798430f89dc91b6835ece11ecc4a31026a5e90"} Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.544982 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jv694" event={"ID":"febb140e-d26e-43db-9924-0f06739b9a4a","Type":"ContainerStarted","Data":"95a19528aa0ed75e0d341a0f15c898fc3ac16cf14ea038307b7611297762c3e3"} Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.545293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-jv694" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.601367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-config\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.601433 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovn-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.601471 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovs-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.601538 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7wr\" (UniqueName: \"kubernetes.io/projected/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-kube-api-access-mj7wr\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.603453 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovn-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.603516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-ovs-rundir\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.605983 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-config\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.607726 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-jv694" podStartSLOduration=3.607707312 podStartE2EDuration="3.607707312s" 
podCreationTimestamp="2026-01-27 16:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:43:50.600101355 +0000 UTC m=+5816.580710473" watchObservedRunningTime="2026-01-27 16:43:50.607707312 +0000 UTC m=+5816.588316410" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.638290 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7wr\" (UniqueName: \"kubernetes.io/projected/3d12bbd4-3d0b-444b-a462-b620a7a5d73d-kube-api-access-mj7wr\") pod \"ovn-controller-metrics-sx2ff\" (UID: \"3d12bbd4-3d0b-444b-a462-b620a7a5d73d\") " pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.690230 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-fw49r"] Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.691831 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.711924 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-fw49r"] Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.805001 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ccfn\" (UniqueName: \"kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.805400 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " 
pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.826536 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-sx2ff" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.907239 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.907423 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ccfn\" (UniqueName: \"kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.909344 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:50 crc kubenswrapper[4772]: I0127 16:43:50.928988 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ccfn\" (UniqueName: \"kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn\") pod \"octavia-db-create-fw49r\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") " pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.017523 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-fw49r" Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.286161 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sx2ff"] Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.483771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-fw49r"] Jan 27 16:43:51 crc kubenswrapper[4772]: W0127 16:43:51.486953 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6985c580_efad_46fe_8e20_9f932ce3af7d.slice/crio-c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb WatchSource:0}: Error finding container c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb: Status 404 returned error can't find the container with id c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.559746 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s5xbr" event={"ID":"7e5eabb2-229a-4d75-b62d-65be688f753a","Type":"ContainerStarted","Data":"ddaa40cdf12e5368b170788f23af626c70d6c92a1e1f8bea4c264cf7a5e59c60"} Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.560074 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.560091 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-s5xbr" event={"ID":"7e5eabb2-229a-4d75-b62d-65be688f753a","Type":"ContainerStarted","Data":"9a5a9cf84dee4ca22e579aea9eb7f9bdecc66933783caa6f5694692162c4c67b"} Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.560104 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.563000 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/octavia-db-create-fw49r" event={"ID":"6985c580-efad-46fe-8e20-9f932ce3af7d","Type":"ContainerStarted","Data":"c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb"} Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.564598 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sx2ff" event={"ID":"3d12bbd4-3d0b-444b-a462-b620a7a5d73d","Type":"ContainerStarted","Data":"56265457e5f04d0ed907eafc6fe108ad439fc0cc44816cbf12164bb2fecad158"} Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.564623 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sx2ff" event={"ID":"3d12bbd4-3d0b-444b-a462-b620a7a5d73d","Type":"ContainerStarted","Data":"e8d8314c18e87a69cb5868bf30d42a1cdbe11175a97f8adcf90b32a51c767f89"} Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.585293 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-s5xbr" podStartSLOduration=4.585271596 podStartE2EDuration="4.585271596s" podCreationTimestamp="2026-01-27 16:43:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:43:51.581885049 +0000 UTC m=+5817.562494157" watchObservedRunningTime="2026-01-27 16:43:51.585271596 +0000 UTC m=+5817.565880704" Jan 27 16:43:51 crc kubenswrapper[4772]: I0127 16:43:51.611017 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-sx2ff" podStartSLOduration=1.6110012889999998 podStartE2EDuration="1.611001289s" podCreationTimestamp="2026-01-27 16:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:43:51.605314457 +0000 UTC m=+5817.585923555" watchObservedRunningTime="2026-01-27 16:43:51.611001289 +0000 UTC m=+5817.591610377" Jan 27 16:43:52 crc 
kubenswrapper[4772]: I0127 16:43:52.029386 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8lxch"]
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.039614 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8lxch"]
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.406603 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c028-account-create-update-x9k2v"]
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.408100 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.411080 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.421246 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c028-account-create-update-x9k2v"]
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.545002 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.545074 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt2lh\" (UniqueName: \"kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.578598 4772 generic.go:334] "Generic (PLEG): container finished" podID="6985c580-efad-46fe-8e20-9f932ce3af7d" containerID="742607c6bfa7c1d2a5f8ceee5a81852f3b14a71713e837fcc9210fdc6dccc556" exitCode=0
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.578660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-fw49r" event={"ID":"6985c580-efad-46fe-8e20-9f932ce3af7d","Type":"ContainerDied","Data":"742607c6bfa7c1d2a5f8ceee5a81852f3b14a71713e837fcc9210fdc6dccc556"}
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.646832 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt2lh\" (UniqueName: \"kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.646884 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.647673 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.673088 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt2lh\" (UniqueName: \"kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh\") pod \"octavia-c028-account-create-update-x9k2v\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") " pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.677282 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b64159-de94-4b31-85ce-b845fdb391b3" path="/var/lib/kubelet/pods/e4b64159-de94-4b31-85ce-b845fdb391b3/volumes"
Jan 27 16:43:52 crc kubenswrapper[4772]: I0127 16:43:52.726048 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:53 crc kubenswrapper[4772]: W0127 16:43:53.225019 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cd343bc_4a97_4ad2_aa82_12ea527398d8.slice/crio-b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1 WatchSource:0}: Error finding container b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1: Status 404 returned error can't find the container with id b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.225111 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c028-account-create-update-x9k2v"]
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.587765 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c028-account-create-update-x9k2v" event={"ID":"1cd343bc-4a97-4ad2-aa82-12ea527398d8","Type":"ContainerStarted","Data":"5ed8b1fb179ec96a663871b1030549de3060884fefe55bd4b363bd01934e9e74"}
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.588043 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c028-account-create-update-x9k2v" event={"ID":"1cd343bc-4a97-4ad2-aa82-12ea527398d8","Type":"ContainerStarted","Data":"b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1"}
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.604391 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-c028-account-create-update-x9k2v" podStartSLOduration=1.604367918 podStartE2EDuration="1.604367918s" podCreationTimestamp="2026-01-27 16:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:43:53.600711224 +0000 UTC m=+5819.581320342" watchObservedRunningTime="2026-01-27 16:43:53.604367918 +0000 UTC m=+5819.584977016"
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.664925 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69"
Jan 27 16:43:53 crc kubenswrapper[4772]: E0127 16:43:53.667741 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b"
Jan 27 16:43:53 crc kubenswrapper[4772]: I0127 16:43:53.932835 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-fw49r"
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.077356 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts\") pod \"6985c580-efad-46fe-8e20-9f932ce3af7d\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") "
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.077608 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ccfn\" (UniqueName: \"kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn\") pod \"6985c580-efad-46fe-8e20-9f932ce3af7d\" (UID: \"6985c580-efad-46fe-8e20-9f932ce3af7d\") "
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.078678 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6985c580-efad-46fe-8e20-9f932ce3af7d" (UID: "6985c580-efad-46fe-8e20-9f932ce3af7d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.082505 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn" (OuterVolumeSpecName: "kube-api-access-9ccfn") pod "6985c580-efad-46fe-8e20-9f932ce3af7d" (UID: "6985c580-efad-46fe-8e20-9f932ce3af7d"). InnerVolumeSpecName "kube-api-access-9ccfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.180573 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6985c580-efad-46fe-8e20-9f932ce3af7d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.180667 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ccfn\" (UniqueName: \"kubernetes.io/projected/6985c580-efad-46fe-8e20-9f932ce3af7d-kube-api-access-9ccfn\") on node \"crc\" DevicePath \"\""
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.606649 4772 generic.go:334] "Generic (PLEG): container finished" podID="1cd343bc-4a97-4ad2-aa82-12ea527398d8" containerID="5ed8b1fb179ec96a663871b1030549de3060884fefe55bd4b363bd01934e9e74" exitCode=0
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.607108 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c028-account-create-update-x9k2v" event={"ID":"1cd343bc-4a97-4ad2-aa82-12ea527398d8","Type":"ContainerDied","Data":"5ed8b1fb179ec96a663871b1030549de3060884fefe55bd4b363bd01934e9e74"}
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.613661 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-fw49r" event={"ID":"6985c580-efad-46fe-8e20-9f932ce3af7d","Type":"ContainerDied","Data":"c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb"}
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.613708 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36e42e52f193c7a6be7631f052325519c9df5bcaa5b03bea3f71227fc45caeb"
Jan 27 16:43:54 crc kubenswrapper[4772]: I0127 16:43:54.613727 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-fw49r"
Jan 27 16:43:55 crc kubenswrapper[4772]: I0127 16:43:55.989208 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.115571 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts\") pod \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") "
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.115754 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt2lh\" (UniqueName: \"kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh\") pod \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\" (UID: \"1cd343bc-4a97-4ad2-aa82-12ea527398d8\") "
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.116027 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cd343bc-4a97-4ad2-aa82-12ea527398d8" (UID: "1cd343bc-4a97-4ad2-aa82-12ea527398d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.116160 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cd343bc-4a97-4ad2-aa82-12ea527398d8-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.121553 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh" (OuterVolumeSpecName: "kube-api-access-mt2lh") pod "1cd343bc-4a97-4ad2-aa82-12ea527398d8" (UID: "1cd343bc-4a97-4ad2-aa82-12ea527398d8"). InnerVolumeSpecName "kube-api-access-mt2lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.219473 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt2lh\" (UniqueName: \"kubernetes.io/projected/1cd343bc-4a97-4ad2-aa82-12ea527398d8-kube-api-access-mt2lh\") on node \"crc\" DevicePath \"\""
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.634101 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c028-account-create-update-x9k2v" event={"ID":"1cd343bc-4a97-4ad2-aa82-12ea527398d8","Type":"ContainerDied","Data":"b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1"}
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.634440 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a814b0fd5cc8c4f7f3f644db99ccfb0e5e7d7ce60566fdb27aab472cf71cb1"
Jan 27 16:43:56 crc kubenswrapper[4772]: I0127 16:43:56.634189 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c028-account-create-update-x9k2v"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.451820 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-4mfnz"]
Jan 27 16:43:58 crc kubenswrapper[4772]: E0127 16:43:58.452213 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6985c580-efad-46fe-8e20-9f932ce3af7d" containerName="mariadb-database-create"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.452225 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6985c580-efad-46fe-8e20-9f932ce3af7d" containerName="mariadb-database-create"
Jan 27 16:43:58 crc kubenswrapper[4772]: E0127 16:43:58.452257 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd343bc-4a97-4ad2-aa82-12ea527398d8" containerName="mariadb-account-create-update"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.452263 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd343bc-4a97-4ad2-aa82-12ea527398d8" containerName="mariadb-account-create-update"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.452440 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6985c580-efad-46fe-8e20-9f932ce3af7d" containerName="mariadb-database-create"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.452459 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd343bc-4a97-4ad2-aa82-12ea527398d8" containerName="mariadb-account-create-update"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.453046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.470834 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-4mfnz"]
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.566324 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4r4\" (UniqueName: \"kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.566386 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.668541 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4r4\" (UniqueName: \"kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.668595 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.670400 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.699560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4r4\" (UniqueName: \"kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4\") pod \"octavia-persistence-db-create-4mfnz\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") " pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.772602 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.934993 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-c1c0-account-create-update-cjqxk"]
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.936867 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.938714 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.946626 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c1c0-account-create-update-cjqxk"]
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.974976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn2n\" (UniqueName: \"kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:58 crc kubenswrapper[4772]: I0127 16:43:58.975316 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.076988 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn2n\" (UniqueName: \"kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.077177 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.077846 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.094132 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn2n\" (UniqueName: \"kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n\") pod \"octavia-c1c0-account-create-update-cjqxk\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") " pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.238795 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-4mfnz"]
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.256462 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.681442 4772 generic.go:334] "Generic (PLEG): container finished" podID="c203bc37-6216-4aa6-8a51-1e0a2f01bb43" containerID="ce5afc895546186c87fa545a3ff11c6c821cf7ba305b10a05525fc458e3f4be7" exitCode=0
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.681685 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4mfnz" event={"ID":"c203bc37-6216-4aa6-8a51-1e0a2f01bb43","Type":"ContainerDied","Data":"ce5afc895546186c87fa545a3ff11c6c821cf7ba305b10a05525fc458e3f4be7"}
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.681709 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4mfnz" event={"ID":"c203bc37-6216-4aa6-8a51-1e0a2f01bb43","Type":"ContainerStarted","Data":"80faa54f23772f274c42b0c93fe478a812eadfb51f692f0e40220291b27183c9"}
Jan 27 16:43:59 crc kubenswrapper[4772]: I0127 16:43:59.743986 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-c1c0-account-create-update-cjqxk"]
Jan 27 16:43:59 crc kubenswrapper[4772]: W0127 16:43:59.756556 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ff09ddf_04fd_42e9_b6fd_8c9fd9fac0e4.slice/crio-f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695 WatchSource:0}: Error finding container f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695: Status 404 returned error can't find the container with id f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695
Jan 27 16:44:00 crc kubenswrapper[4772]: I0127 16:44:00.693943 4772 generic.go:334] "Generic (PLEG): container finished" podID="7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" containerID="5f81345bb4c2b5ce3e497a908767c4ee00a2cf35d4b6e4a9ea4e1fe6b2891391" exitCode=0
Jan 27 16:44:00 crc kubenswrapper[4772]: I0127 16:44:00.694027 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c1c0-account-create-update-cjqxk" event={"ID":"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4","Type":"ContainerDied","Data":"5f81345bb4c2b5ce3e497a908767c4ee00a2cf35d4b6e4a9ea4e1fe6b2891391"}
Jan 27 16:44:00 crc kubenswrapper[4772]: I0127 16:44:00.694063 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c1c0-account-create-update-cjqxk" event={"ID":"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4","Type":"ContainerStarted","Data":"f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695"}
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.055132 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.115845 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4r4\" (UniqueName: \"kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4\") pod \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") "
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.116092 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts\") pod \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\" (UID: \"c203bc37-6216-4aa6-8a51-1e0a2f01bb43\") "
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.117143 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c203bc37-6216-4aa6-8a51-1e0a2f01bb43" (UID: "c203bc37-6216-4aa6-8a51-1e0a2f01bb43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.122813 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4" (OuterVolumeSpecName: "kube-api-access-xv4r4") pod "c203bc37-6216-4aa6-8a51-1e0a2f01bb43" (UID: "c203bc37-6216-4aa6-8a51-1e0a2f01bb43"). InnerVolumeSpecName "kube-api-access-xv4r4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.218046 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.218083 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv4r4\" (UniqueName: \"kubernetes.io/projected/c203bc37-6216-4aa6-8a51-1e0a2f01bb43-kube-api-access-xv4r4\") on node \"crc\" DevicePath \"\""
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.704367 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4mfnz" event={"ID":"c203bc37-6216-4aa6-8a51-1e0a2f01bb43","Type":"ContainerDied","Data":"80faa54f23772f274c42b0c93fe478a812eadfb51f692f0e40220291b27183c9"}
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.704441 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80faa54f23772f274c42b0c93fe478a812eadfb51f692f0e40220291b27183c9"
Jan 27 16:44:01 crc kubenswrapper[4772]: I0127 16:44:01.704405 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-4mfnz"
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.086191 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.137467 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts\") pod \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") "
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.137858 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnn2n\" (UniqueName: \"kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n\") pod \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\" (UID: \"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4\") "
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.138601 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" (UID: "7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.143505 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n" (OuterVolumeSpecName: "kube-api-access-qnn2n") pod "7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" (UID: "7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4"). InnerVolumeSpecName "kube-api-access-qnn2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.240161 4772 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.240218 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnn2n\" (UniqueName: \"kubernetes.io/projected/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4-kube-api-access-qnn2n\") on node \"crc\" DevicePath \"\""
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.714958 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-c1c0-account-create-update-cjqxk" event={"ID":"7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4","Type":"ContainerDied","Data":"f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695"}
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.714986 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-c1c0-account-create-update-cjqxk"
Jan 27 16:44:02 crc kubenswrapper[4772]: I0127 16:44:02.714996 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e01eab28c27034647a68108d237eb65cb31397f37dce39098e0a9f49ce0695"
Jan 27 16:44:02 crc kubenswrapper[4772]: E0127 16:44:02.891105 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ff09ddf_04fd_42e9_b6fd_8c9fd9fac0e4.slice\": RecentStats: unable to find data in memory cache]"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.767087 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6d96bf4746-x9c97"]
Jan 27 16:44:04 crc kubenswrapper[4772]: E0127 16:44:04.767966 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" containerName="mariadb-account-create-update"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.767983 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" containerName="mariadb-account-create-update"
Jan 27 16:44:04 crc kubenswrapper[4772]: E0127 16:44:04.768019 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c203bc37-6216-4aa6-8a51-1e0a2f01bb43" containerName="mariadb-database-create"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.768027 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c203bc37-6216-4aa6-8a51-1e0a2f01bb43" containerName="mariadb-database-create"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.768294 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" containerName="mariadb-account-create-update"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.768337 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c203bc37-6216-4aa6-8a51-1e0a2f01bb43" containerName="mariadb-database-create"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.769944 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.773668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-bqhmr"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.786432 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.786972 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.794443 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-config-data\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.794526 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-octavia-run\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.794657 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-config-data-merged\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.794691 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-scripts\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.794769 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-combined-ca-bundle\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.822744 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6d96bf4746-x9c97"]
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.896347 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-combined-ca-bundle\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.896452 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-config-data\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.896484 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-octavia-run\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.896625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-config-data-merged\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.896651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-scripts\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.897807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-octavia-run\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.897884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9371d269-02b3-4049-aeea-4fd56c648b89-config-data-merged\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97"
Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.902371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-config-data\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " 
pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.904702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-scripts\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:04 crc kubenswrapper[4772]: I0127 16:44:04.919661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9371d269-02b3-4049-aeea-4fd56c648b89-combined-ca-bundle\") pod \"octavia-api-6d96bf4746-x9c97\" (UID: \"9371d269-02b3-4049-aeea-4fd56c648b89\") " pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:05 crc kubenswrapper[4772]: I0127 16:44:05.102347 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:05 crc kubenswrapper[4772]: I0127 16:44:05.637400 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:44:05 crc kubenswrapper[4772]: I0127 16:44:05.649405 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6d96bf4746-x9c97"] Jan 27 16:44:05 crc kubenswrapper[4772]: I0127 16:44:05.663221 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:44:05 crc kubenswrapper[4772]: E0127 16:44:05.663507 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:44:05 crc 
kubenswrapper[4772]: I0127 16:44:05.739252 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6d96bf4746-x9c97" event={"ID":"9371d269-02b3-4049-aeea-4fd56c648b89","Type":"ContainerStarted","Data":"aeee5e34a46468c554b59b9f0711b0d1ef508610bc8952cbbce4d2282ea800f3"} Jan 27 16:44:07 crc kubenswrapper[4772]: I0127 16:44:07.577247 4772 scope.go:117] "RemoveContainer" containerID="6b8b1c4b2ab6a42f7f2fdd3da73f914caa96bb6a3722955fd242601b775105ee" Jan 27 16:44:07 crc kubenswrapper[4772]: I0127 16:44:07.604286 4772 scope.go:117] "RemoveContainer" containerID="5344c5595e4c241d1f3af5be472eb435a5a87e2e74a29b46f8cee0d5b6b1c135" Jan 27 16:44:07 crc kubenswrapper[4772]: I0127 16:44:07.657065 4772 scope.go:117] "RemoveContainer" containerID="058f6eb2f4f5195120a96ef6b4691590bd77114e18b6ed23148cdd3864b329b4" Jan 27 16:44:07 crc kubenswrapper[4772]: I0127 16:44:07.680198 4772 scope.go:117] "RemoveContainer" containerID="761ce723d19214c2204c8e256e0ee21bf82feb31ce0c9f783c4a5819a97623f7" Jan 27 16:44:15 crc kubenswrapper[4772]: I0127 16:44:15.856306 4772 generic.go:334] "Generic (PLEG): container finished" podID="9371d269-02b3-4049-aeea-4fd56c648b89" containerID="43206bad0bf3620a1a3676348a740d052b18a789884d645086fe2a2e3fe3d350" exitCode=0 Jan 27 16:44:15 crc kubenswrapper[4772]: I0127 16:44:15.856471 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6d96bf4746-x9c97" event={"ID":"9371d269-02b3-4049-aeea-4fd56c648b89","Type":"ContainerDied","Data":"43206bad0bf3620a1a3676348a740d052b18a789884d645086fe2a2e3fe3d350"} Jan 27 16:44:16 crc kubenswrapper[4772]: I0127 16:44:16.869838 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6d96bf4746-x9c97" event={"ID":"9371d269-02b3-4049-aeea-4fd56c648b89","Type":"ContainerStarted","Data":"672d529986fa767415faf2a5c7d42386d2e5373e03758b384acd2235c1b4fb8c"} Jan 27 16:44:16 crc kubenswrapper[4772]: I0127 16:44:16.870214 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/octavia-api-6d96bf4746-x9c97" event={"ID":"9371d269-02b3-4049-aeea-4fd56c648b89","Type":"ContainerStarted","Data":"70778e96b10191ab7130a4cd6710c7b9b8b55518c8923812b7964a26367367d7"} Jan 27 16:44:16 crc kubenswrapper[4772]: I0127 16:44:16.870483 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:16 crc kubenswrapper[4772]: I0127 16:44:16.892595 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6d96bf4746-x9c97" podStartSLOduration=3.341554981 podStartE2EDuration="12.892574091s" podCreationTimestamp="2026-01-27 16:44:04 +0000 UTC" firstStartedPulling="2026-01-27 16:44:05.637138609 +0000 UTC m=+5831.617747707" lastFinishedPulling="2026-01-27 16:44:15.188157699 +0000 UTC m=+5841.168766817" observedRunningTime="2026-01-27 16:44:16.886379804 +0000 UTC m=+5842.866988902" watchObservedRunningTime="2026-01-27 16:44:16.892574091 +0000 UTC m=+5842.873183189" Jan 27 16:44:17 crc kubenswrapper[4772]: I0127 16:44:17.662642 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:44:17 crc kubenswrapper[4772]: E0127 16:44:17.663538 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:44:17 crc kubenswrapper[4772]: I0127 16:44:17.877343 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.384532 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.388234 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-s5xbr" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.395916 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-jv694" podUID="febb140e-d26e-43db-9924-0f06739b9a4a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 16:44:23 crc kubenswrapper[4772]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 16:44:23 crc kubenswrapper[4772]: > Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.520318 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-jv694-config-dfbjt"] Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.522111 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.524499 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.541010 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jv694-config-dfbjt"] Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.590849 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.590912 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.590953 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.591034 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6kc\" (UniqueName: \"kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.591218 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.591244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.692778 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.692831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693472 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693533 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693574 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693610 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6kc\" (UniqueName: \"kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693615 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.693650 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.694350 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.695594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.721127 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6kc\" (UniqueName: \"kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc\") pod \"ovn-controller-jv694-config-dfbjt\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:23 crc kubenswrapper[4772]: I0127 16:44:23.867846 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:24 crc kubenswrapper[4772]: I0127 16:44:24.354673 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-jv694-config-dfbjt"] Jan 27 16:44:24 crc kubenswrapper[4772]: I0127 16:44:24.944544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jv694-config-dfbjt" event={"ID":"7a86d738-a01c-4772-a150-7cc01a6503eb","Type":"ContainerStarted","Data":"2c24a009396935ad397dec4435d87a19075c63011ad8f23a534d35cc814f6ddc"} Jan 27 16:44:24 crc kubenswrapper[4772]: I0127 16:44:24.944939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-jv694-config-dfbjt" event={"ID":"7a86d738-a01c-4772-a150-7cc01a6503eb","Type":"ContainerStarted","Data":"475062218bf811acbfff5ec92de82f6c43e05cf2fe5ad2412beabfe854fe2374"} Jan 27 16:44:25 crc kubenswrapper[4772]: I0127 16:44:25.956225 4772 generic.go:334] "Generic (PLEG): container finished" podID="7a86d738-a01c-4772-a150-7cc01a6503eb" containerID="2c24a009396935ad397dec4435d87a19075c63011ad8f23a534d35cc814f6ddc" exitCode=0 Jan 27 16:44:25 crc kubenswrapper[4772]: I0127 16:44:25.956264 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-jv694-config-dfbjt" event={"ID":"7a86d738-a01c-4772-a150-7cc01a6503eb","Type":"ContainerDied","Data":"2c24a009396935ad397dec4435d87a19075c63011ad8f23a534d35cc814f6ddc"} Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.339125 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.470912 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471115 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471162 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run" (OuterVolumeSpecName: "var-run") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471382 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl6kc\" (UniqueName: \"kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471405 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471397 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471518 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts\") pod \"7a86d738-a01c-4772-a150-7cc01a6503eb\" (UID: \"7a86d738-a01c-4772-a150-7cc01a6503eb\") " Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.471540 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.472128 4772 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.472152 4772 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.472177 4772 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7a86d738-a01c-4772-a150-7cc01a6503eb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.472206 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.472233 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts" (OuterVolumeSpecName: "scripts") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.485449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc" (OuterVolumeSpecName: "kube-api-access-tl6kc") pod "7a86d738-a01c-4772-a150-7cc01a6503eb" (UID: "7a86d738-a01c-4772-a150-7cc01a6503eb"). InnerVolumeSpecName "kube-api-access-tl6kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.574018 4772 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.574055 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7a86d738-a01c-4772-a150-7cc01a6503eb-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.574067 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl6kc\" (UniqueName: \"kubernetes.io/projected/7a86d738-a01c-4772-a150-7cc01a6503eb-kube-api-access-tl6kc\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:27 crc kubenswrapper[4772]: I0127 16:44:27.765658 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-jv694-config-dfbjt"] Jan 27 16:44:27 crc 
kubenswrapper[4772]: I0127 16:44:27.774373 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-jv694-config-dfbjt"] Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.015027 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475062218bf811acbfff5ec92de82f6c43e05cf2fe5ad2412beabfe854fe2374" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.015139 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-jv694-config-dfbjt" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.240380 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-4wcv5"] Jan 27 16:44:28 crc kubenswrapper[4772]: E0127 16:44:28.240764 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a86d738-a01c-4772-a150-7cc01a6503eb" containerName="ovn-config" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.240784 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a86d738-a01c-4772-a150-7cc01a6503eb" containerName="ovn-config" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.241000 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a86d738-a01c-4772-a150-7cc01a6503eb" containerName="ovn-config" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.242010 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.243593 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.244024 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.244091 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.265717 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4wcv5"] Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.385117 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-jv694" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.389248 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data-merged\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.389309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c163e8de-ea19-4a1c-8791-8659b9a09ba3-hm-ports\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.389355 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data\") pod \"octavia-rsyslog-4wcv5\" (UID: 
\"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.389378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-scripts\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.491098 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data-merged\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.491290 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c163e8de-ea19-4a1c-8791-8659b9a09ba3-hm-ports\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.491405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.491612 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data-merged\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 
16:44:28.492040 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c163e8de-ea19-4a1c-8791-8659b9a09ba3-hm-ports\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.492429 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-scripts\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.496504 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-config-data\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.496717 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c163e8de-ea19-4a1c-8791-8659b9a09ba3-scripts\") pod \"octavia-rsyslog-4wcv5\" (UID: \"c163e8de-ea19-4a1c-8791-8659b9a09ba3\") " pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.561276 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.682439 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a86d738-a01c-4772-a150-7cc01a6503eb" path="/var/lib/kubelet/pods/7a86d738-a01c-4772-a150-7cc01a6503eb/volumes" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.904907 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.906667 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.916475 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:44:28 crc kubenswrapper[4772]: I0127 16:44:28.957276 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.019624 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.019779 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.120698 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.120941 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.121552 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.131963 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config\") pod \"octavia-image-upload-59f8cff499-wwbn2\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.145605 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4wcv5"] Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.277444 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.317870 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-4wcv5"] Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.619741 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-bz6q7"] Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.621959 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.624764 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.643206 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bz6q7"] Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.663615 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:44:29 crc kubenswrapper[4772]: E0127 16:44:29.663921 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.734392 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: 
I0127 16:44:29.734923 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.734981 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.735107 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.748924 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:44:29 crc kubenswrapper[4772]: W0127 16:44:29.760500 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79cf41b1_8613_4f3f_8e1e_35d0ca7efffe.slice/crio-4fb27a69e55f5ad48bd9d1e4c9841ecd324aac0beb1fbf6032053c07c57d076f WatchSource:0}: Error finding container 4fb27a69e55f5ad48bd9d1e4c9841ecd324aac0beb1fbf6032053c07c57d076f: Status 404 returned error can't find the container with id 4fb27a69e55f5ad48bd9d1e4c9841ecd324aac0beb1fbf6032053c07c57d076f Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.837088 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.837190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.837809 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.839060 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.839107 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.843392 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle\") pod 
\"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.843533 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.852227 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data\") pod \"octavia-db-sync-bz6q7\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:29 crc kubenswrapper[4772]: I0127 16:44:29.944031 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:30 crc kubenswrapper[4772]: I0127 16:44:30.078083 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" event={"ID":"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe","Type":"ContainerStarted","Data":"4fb27a69e55f5ad48bd9d1e4c9841ecd324aac0beb1fbf6032053c07c57d076f"} Jan 27 16:44:30 crc kubenswrapper[4772]: I0127 16:44:30.085914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4wcv5" event={"ID":"c163e8de-ea19-4a1c-8791-8659b9a09ba3","Type":"ContainerStarted","Data":"e08af4172eecfe384e6bdb1a3062be61c027c3f7eb2af34149f6166aeea022d4"} Jan 27 16:44:30 crc kubenswrapper[4772]: I0127 16:44:30.417381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bz6q7"] Jan 27 16:44:30 crc kubenswrapper[4772]: W0127 16:44:30.424241 4772 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1152dfc9_a3d1_41a5_92cb_a5a8c481a8fa.slice/crio-1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba WatchSource:0}: Error finding container 1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba: Status 404 returned error can't find the container with id 1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba Jan 27 16:44:31 crc kubenswrapper[4772]: I0127 16:44:31.096029 4772 generic.go:334] "Generic (PLEG): container finished" podID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerID="908bfdb73b8d3ebb90afef686e407a3a2da1f2ea295c24e132068119ed919b42" exitCode=0 Jan 27 16:44:31 crc kubenswrapper[4772]: I0127 16:44:31.096075 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bz6q7" event={"ID":"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa","Type":"ContainerDied","Data":"908bfdb73b8d3ebb90afef686e407a3a2da1f2ea295c24e132068119ed919b42"} Jan 27 16:44:31 crc kubenswrapper[4772]: I0127 16:44:31.096392 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bz6q7" event={"ID":"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa","Type":"ContainerStarted","Data":"1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba"} Jan 27 16:44:34 crc kubenswrapper[4772]: I0127 16:44:34.130145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4wcv5" event={"ID":"c163e8de-ea19-4a1c-8791-8659b9a09ba3","Type":"ContainerStarted","Data":"7a55c8b7d3f5ef107842b8268aaea45baea4ce51ff9790efdd6cb25f55809a83"} Jan 27 16:44:34 crc kubenswrapper[4772]: I0127 16:44:34.134363 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bz6q7" event={"ID":"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa","Type":"ContainerStarted","Data":"3c5cbb7ef3f7daca21bab77efa8022a5214525eb8b99dd2483d6689f1db4cb83"} Jan 27 16:44:34 crc kubenswrapper[4772]: I0127 16:44:34.165928 4772 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-bz6q7" podStartSLOduration=5.165907717 podStartE2EDuration="5.165907717s" podCreationTimestamp="2026-01-27 16:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:44:34.162985484 +0000 UTC m=+5860.143594582" watchObservedRunningTime="2026-01-27 16:44:34.165907717 +0000 UTC m=+5860.146516815" Jan 27 16:44:36 crc kubenswrapper[4772]: I0127 16:44:36.152078 4772 generic.go:334] "Generic (PLEG): container finished" podID="c163e8de-ea19-4a1c-8791-8659b9a09ba3" containerID="7a55c8b7d3f5ef107842b8268aaea45baea4ce51ff9790efdd6cb25f55809a83" exitCode=0 Jan 27 16:44:36 crc kubenswrapper[4772]: I0127 16:44:36.152182 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4wcv5" event={"ID":"c163e8de-ea19-4a1c-8791-8659b9a09ba3","Type":"ContainerDied","Data":"7a55c8b7d3f5ef107842b8268aaea45baea4ce51ff9790efdd6cb25f55809a83"} Jan 27 16:44:38 crc kubenswrapper[4772]: I0127 16:44:38.209596 4772 generic.go:334] "Generic (PLEG): container finished" podID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerID="3c5cbb7ef3f7daca21bab77efa8022a5214525eb8b99dd2483d6689f1db4cb83" exitCode=0 Jan 27 16:44:38 crc kubenswrapper[4772]: I0127 16:44:38.209674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bz6q7" event={"ID":"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa","Type":"ContainerDied","Data":"3c5cbb7ef3f7daca21bab77efa8022a5214525eb8b99dd2483d6689f1db4cb83"} Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.245540 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.492834 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6d96bf4746-x9c97" Jan 27 16:44:39 
crc kubenswrapper[4772]: I0127 16:44:39.643372 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.752775 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data\") pod \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.753428 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts\") pod \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.753544 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle\") pod \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.753691 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged\") pod \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\" (UID: \"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa\") " Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.760854 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts" (OuterVolumeSpecName: "scripts") pod "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" (UID: "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.764295 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data" (OuterVolumeSpecName: "config-data") pod "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" (UID: "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.789692 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" (UID: "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.791036 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" (UID: "1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa"). InnerVolumeSpecName "config-data-merged". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.855355 4772 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data-merged\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.855380 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.855390 4772 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:39 crc kubenswrapper[4772]: I0127 16:44:39.855400 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.232047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-4wcv5" event={"ID":"c163e8de-ea19-4a1c-8791-8659b9a09ba3","Type":"ContainerStarted","Data":"33227eee176068edc8a2071687aaea5f2a55907beefcfe1533e576e25e787ca5"} Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.232898 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.234304 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bz6q7" Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.234292 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bz6q7" event={"ID":"1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa","Type":"ContainerDied","Data":"1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba"} Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.234434 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1819e653d8ad3c01c8686d552abe73590b799f2ef81d0815681adfb79d2743ba" Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.236203 4772 generic.go:334] "Generic (PLEG): container finished" podID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerID="68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e" exitCode=0 Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.236253 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" event={"ID":"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe","Type":"ContainerDied","Data":"68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e"} Jan 27 16:44:40 crc kubenswrapper[4772]: I0127 16:44:40.253960 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-4wcv5" podStartSLOduration=2.16546444 podStartE2EDuration="12.253932859s" podCreationTimestamp="2026-01-27 16:44:28 +0000 UTC" firstStartedPulling="2026-01-27 16:44:29.155210433 +0000 UTC m=+5855.135819531" lastFinishedPulling="2026-01-27 16:44:39.243678852 +0000 UTC m=+5865.224287950" observedRunningTime="2026-01-27 16:44:40.253798315 +0000 UTC m=+5866.234407423" watchObservedRunningTime="2026-01-27 16:44:40.253932859 +0000 UTC m=+5866.234541967" Jan 27 16:44:42 crc kubenswrapper[4772]: I0127 16:44:42.259447 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" 
event={"ID":"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe","Type":"ContainerStarted","Data":"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9"} Jan 27 16:44:42 crc kubenswrapper[4772]: I0127 16:44:42.297222 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" podStartSLOduration=2.814339384 podStartE2EDuration="14.297157088s" podCreationTimestamp="2026-01-27 16:44:28 +0000 UTC" firstStartedPulling="2026-01-27 16:44:29.763413509 +0000 UTC m=+5855.744022617" lastFinishedPulling="2026-01-27 16:44:41.246231233 +0000 UTC m=+5867.226840321" observedRunningTime="2026-01-27 16:44:42.281932464 +0000 UTC m=+5868.262541632" watchObservedRunningTime="2026-01-27 16:44:42.297157088 +0000 UTC m=+5868.277766216" Jan 27 16:44:42 crc kubenswrapper[4772]: I0127 16:44:42.663426 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:44:42 crc kubenswrapper[4772]: E0127 16:44:42.663806 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:44:55 crc kubenswrapper[4772]: I0127 16:44:55.662977 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:44:55 crc kubenswrapper[4772]: E0127 16:44:55.663845 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:44:58 crc kubenswrapper[4772]: I0127 16:44:58.595889 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-4wcv5" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.169113 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l"] Jan 27 16:45:00 crc kubenswrapper[4772]: E0127 16:45:00.169864 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerName="init" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.169883 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerName="init" Jan 27 16:45:00 crc kubenswrapper[4772]: E0127 16:45:00.169898 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerName="octavia-db-sync" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.169906 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerName="octavia-db-sync" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.170204 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" containerName="octavia-db-sync" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.170982 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.173235 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.173820 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.179531 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l"] Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.283194 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnlq\" (UniqueName: \"kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.283419 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.283492 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.385699 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.385785 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.385823 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnlq\" (UniqueName: \"kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.386962 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.393033 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.403309 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnlq\" (UniqueName: \"kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq\") pod \"collect-profiles-29492205-9hv9l\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.495346 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:00 crc kubenswrapper[4772]: I0127 16:45:00.986558 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l"] Jan 27 16:45:01 crc kubenswrapper[4772]: I0127 16:45:01.470330 4772 generic.go:334] "Generic (PLEG): container finished" podID="eb399a66-1690-4026-9b2e-9e399d3270d2" containerID="3b5c3dfd99ca4b5982c3131c3d5ce465e41cbe4ff9774e156e9f425715410ede" exitCode=0 Jan 27 16:45:01 crc kubenswrapper[4772]: I0127 16:45:01.470389 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" event={"ID":"eb399a66-1690-4026-9b2e-9e399d3270d2","Type":"ContainerDied","Data":"3b5c3dfd99ca4b5982c3131c3d5ce465e41cbe4ff9774e156e9f425715410ede"} Jan 27 16:45:01 crc kubenswrapper[4772]: I0127 16:45:01.470646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" 
event={"ID":"eb399a66-1690-4026-9b2e-9e399d3270d2","Type":"ContainerStarted","Data":"026979e7ba50a908b6032a92cd91e7808fad026d1d1ebe11a08fe7589b3b8a75"} Jan 27 16:45:02 crc kubenswrapper[4772]: I0127 16:45:02.915392 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.037642 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume\") pod \"eb399a66-1690-4026-9b2e-9e399d3270d2\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.037723 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnlq\" (UniqueName: \"kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq\") pod \"eb399a66-1690-4026-9b2e-9e399d3270d2\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.037831 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume\") pod \"eb399a66-1690-4026-9b2e-9e399d3270d2\" (UID: \"eb399a66-1690-4026-9b2e-9e399d3270d2\") " Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.038219 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb399a66-1690-4026-9b2e-9e399d3270d2" (UID: "eb399a66-1690-4026-9b2e-9e399d3270d2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.038694 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb399a66-1690-4026-9b2e-9e399d3270d2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.043533 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq" (OuterVolumeSpecName: "kube-api-access-8bnlq") pod "eb399a66-1690-4026-9b2e-9e399d3270d2" (UID: "eb399a66-1690-4026-9b2e-9e399d3270d2"). InnerVolumeSpecName "kube-api-access-8bnlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.043862 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb399a66-1690-4026-9b2e-9e399d3270d2" (UID: "eb399a66-1690-4026-9b2e-9e399d3270d2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.141868 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnlq\" (UniqueName: \"kubernetes.io/projected/eb399a66-1690-4026-9b2e-9e399d3270d2-kube-api-access-8bnlq\") on node \"crc\" DevicePath \"\"" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.142328 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb399a66-1690-4026-9b2e-9e399d3270d2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.487773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" event={"ID":"eb399a66-1690-4026-9b2e-9e399d3270d2","Type":"ContainerDied","Data":"026979e7ba50a908b6032a92cd91e7808fad026d1d1ebe11a08fe7589b3b8a75"} Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.487813 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026979e7ba50a908b6032a92cd91e7808fad026d1d1ebe11a08fe7589b3b8a75" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.487865 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l" Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.684246 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.684806 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="octavia-amphora-httpd" containerID="cri-o://be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9" gracePeriod=30 Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.988763 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr"] Jan 27 16:45:03 crc kubenswrapper[4772]: I0127 16:45:03.992470 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-4mmgr"] Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.199529 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.364479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image\") pod \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.364670 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config\") pod \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\" (UID: \"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe\") " Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.392412 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" (UID: "79cf41b1-8613-4f3f-8e1e-35d0ca7efffe"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.466954 4772 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.470033 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" (UID: "79cf41b1-8613-4f3f-8e1e-35d0ca7efffe"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.498961 4772 generic.go:334] "Generic (PLEG): container finished" podID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerID="be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9" exitCode=0 Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.499005 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" event={"ID":"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe","Type":"ContainerDied","Data":"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9"} Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.499032 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" event={"ID":"79cf41b1-8613-4f3f-8e1e-35d0ca7efffe","Type":"ContainerDied","Data":"4fb27a69e55f5ad48bd9d1e4c9841ecd324aac0beb1fbf6032053c07c57d076f"} Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.499064 4772 scope.go:117] "RemoveContainer" containerID="be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.499222 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-wwbn2" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.533609 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.538002 4772 scope.go:117] "RemoveContainer" containerID="68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.541704 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-wwbn2"] Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.556052 4772 scope.go:117] "RemoveContainer" containerID="be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9" Jan 27 16:45:04 crc kubenswrapper[4772]: E0127 16:45:04.556691 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9\": container with ID starting with be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9 not found: ID does not exist" containerID="be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.556743 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9"} err="failed to get container status \"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9\": rpc error: code = NotFound desc = could not find container \"be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9\": container with ID starting with be7964b45b1fe07afd2ed607cc55d8b0c5ba7716904f6e9df312a541c0d3a1a9 not found: ID does not exist" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.556772 4772 scope.go:117] "RemoveContainer" 
containerID="68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e" Jan 27 16:45:04 crc kubenswrapper[4772]: E0127 16:45:04.557142 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e\": container with ID starting with 68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e not found: ID does not exist" containerID="68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.557223 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e"} err="failed to get container status \"68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e\": rpc error: code = NotFound desc = could not find container \"68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e\": container with ID starting with 68f623ccc11f688238dc60d96b2aa3b6231be01063d7a945260daa5b84ec5a0e not found: ID does not exist" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.568943 4772 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe-amphora-image\") on node \"crc\" DevicePath \"\"" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.675457 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" path="/var/lib/kubelet/pods/79cf41b1-8613-4f3f-8e1e-35d0ca7efffe/volumes" Jan 27 16:45:04 crc kubenswrapper[4772]: I0127 16:45:04.676472 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4afc52-82aa-4768-9cc8-5e9236fc4330" path="/var/lib/kubelet/pods/9b4afc52-82aa-4768-9cc8-5e9236fc4330/volumes" Jan 27 16:45:07 crc kubenswrapper[4772]: I0127 16:45:07.834396 4772 scope.go:117] 
"RemoveContainer" containerID="8dd8add741d2c060daa432ff2f192c7a04c82b2eab3360197f22c851ca7bd6c0" Jan 27 16:45:09 crc kubenswrapper[4772]: I0127 16:45:09.663604 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:45:09 crc kubenswrapper[4772]: E0127 16:45:09.665158 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:45:22 crc kubenswrapper[4772]: I0127 16:45:22.663265 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:45:22 crc kubenswrapper[4772]: E0127 16:45:22.664409 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.930035 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-jmckq"] Jan 27 16:45:32 crc kubenswrapper[4772]: E0127 16:45:32.931195 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb399a66-1690-4026-9b2e-9e399d3270d2" containerName="collect-profiles" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.931219 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb399a66-1690-4026-9b2e-9e399d3270d2" containerName="collect-profiles" 
Jan 27 16:45:32 crc kubenswrapper[4772]: E0127 16:45:32.931237 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="init" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.931245 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="init" Jan 27 16:45:32 crc kubenswrapper[4772]: E0127 16:45:32.931260 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="octavia-amphora-httpd" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.931268 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="octavia-amphora-httpd" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.931506 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb399a66-1690-4026-9b2e-9e399d3270d2" containerName="collect-profiles" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.931530 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="79cf41b1-8613-4f3f-8e1e-35d0ca7efffe" containerName="octavia-amphora-httpd" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.932909 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.936358 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.936973 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.937060 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Jan 27 16:45:32 crc kubenswrapper[4772]: I0127 16:45:32.962653 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jmckq"] Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.038051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-combined-ca-bundle\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.038141 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-amphora-certs\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.038196 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: 
I0127 16:45:33.038222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-scripts\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.038381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data-merged\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.038414 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d32d3e24-6f03-46cc-b7ae-61383778b183-hm-ports\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140419 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-scripts\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140571 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data-merged\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140604 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d32d3e24-6f03-46cc-b7ae-61383778b183-hm-ports\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140628 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-combined-ca-bundle\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140684 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-amphora-certs\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.140712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.141068 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data-merged\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.141904 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/d32d3e24-6f03-46cc-b7ae-61383778b183-hm-ports\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.146534 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-scripts\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.147063 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-combined-ca-bundle\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.147102 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-config-data\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.147589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d32d3e24-6f03-46cc-b7ae-61383778b183-amphora-certs\") pod \"octavia-healthmanager-jmckq\" (UID: \"d32d3e24-6f03-46cc-b7ae-61383778b183\") " pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.265885 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:33 crc kubenswrapper[4772]: I0127 16:45:33.801520 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jmckq"] Jan 27 16:45:33 crc kubenswrapper[4772]: W0127 16:45:33.803243 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd32d3e24_6f03_46cc_b7ae_61383778b183.slice/crio-a0320d675c0ddf4fbe99af5532cb287c52a9f435573808fcb1d0149eac42ba29 WatchSource:0}: Error finding container a0320d675c0ddf4fbe99af5532cb287c52a9f435573808fcb1d0149eac42ba29: Status 404 returned error can't find the container with id a0320d675c0ddf4fbe99af5532cb287c52a9f435573808fcb1d0149eac42ba29 Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.425940 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-6v7mr"] Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.430742 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.433125 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.433142 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.441048 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-6v7mr"] Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467226 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b6f78da3-da1c-4e27-ab65-581c656f74d9-hm-ports\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467263 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-combined-ca-bundle\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467344 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data-merged\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467388 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-amphora-certs\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.467457 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-scripts\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569080 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-scripts\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569280 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b6f78da3-da1c-4e27-ab65-581c656f74d9-hm-ports\") pod 
\"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569321 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-combined-ca-bundle\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data-merged\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.569408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-amphora-certs\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.571202 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/b6f78da3-da1c-4e27-ab65-581c656f74d9-hm-ports\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.571526 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data-merged\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " 
pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.575688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-config-data\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.575919 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-scripts\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.576774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-amphora-certs\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.577048 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f78da3-da1c-4e27-ab65-581c656f74d9-combined-ca-bundle\") pod \"octavia-housekeeping-6v7mr\" (UID: \"b6f78da3-da1c-4e27-ab65-581c656f74d9\") " pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.761709 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.785892 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jmckq" event={"ID":"d32d3e24-6f03-46cc-b7ae-61383778b183","Type":"ContainerStarted","Data":"b085960b93f23fd7b1cbd5e3d2ceff5f975cbe459c57a4692b506d5fc2410bb8"} Jan 27 16:45:34 crc kubenswrapper[4772]: I0127 16:45:34.785939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jmckq" event={"ID":"d32d3e24-6f03-46cc-b7ae-61383778b183","Type":"ContainerStarted","Data":"a0320d675c0ddf4fbe99af5532cb287c52a9f435573808fcb1d0149eac42ba29"} Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.303280 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-6v7mr"] Jan 27 16:45:35 crc kubenswrapper[4772]: W0127 16:45:35.309626 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f78da3_da1c_4e27_ab65_581c656f74d9.slice/crio-e42e3263b76dbd91f396be8169e112da836e1ef6a187e6a6692dbf394b121366 WatchSource:0}: Error finding container e42e3263b76dbd91f396be8169e112da836e1ef6a187e6a6692dbf394b121366: Status 404 returned error can't find the container with id e42e3263b76dbd91f396be8169e112da836e1ef6a187e6a6692dbf394b121366 Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.796196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-6v7mr" event={"ID":"b6f78da3-da1c-4e27-ab65-581c656f74d9","Type":"ContainerStarted","Data":"e42e3263b76dbd91f396be8169e112da836e1ef6a187e6a6692dbf394b121366"} Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.932077 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-4q859"] Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.935573 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-4q859" Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.938769 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.940485 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Jan 27 16:45:35 crc kubenswrapper[4772]: I0127 16:45:35.942427 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-4q859"] Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.009455 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.009509 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-scripts\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.010266 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab431622-b724-4ed4-be2b-67ec8b5956db-hm-ports\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.010495 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-amphora-certs\") pod \"octavia-worker-4q859\" (UID: 
\"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.010627 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data-merged\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.010686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-combined-ca-bundle\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab431622-b724-4ed4-be2b-67ec8b5956db-hm-ports\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112245 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-amphora-certs\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112311 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data-merged\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 
16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-combined-ca-bundle\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112401 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.112435 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-scripts\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.113269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ab431622-b724-4ed4-be2b-67ec8b5956db-hm-ports\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.113278 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data-merged\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.118736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-combined-ca-bundle\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.118832 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-amphora-certs\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.119225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-config-data\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.138565 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab431622-b724-4ed4-be2b-67ec8b5956db-scripts\") pod \"octavia-worker-4q859\" (UID: \"ab431622-b724-4ed4-be2b-67ec8b5956db\") " pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.273621 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-4q859" Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.809012 4772 generic.go:334] "Generic (PLEG): container finished" podID="d32d3e24-6f03-46cc-b7ae-61383778b183" containerID="b085960b93f23fd7b1cbd5e3d2ceff5f975cbe459c57a4692b506d5fc2410bb8" exitCode=0 Jan 27 16:45:36 crc kubenswrapper[4772]: I0127 16:45:36.809286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jmckq" event={"ID":"d32d3e24-6f03-46cc-b7ae-61383778b183","Type":"ContainerDied","Data":"b085960b93f23fd7b1cbd5e3d2ceff5f975cbe459c57a4692b506d5fc2410bb8"} Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.206681 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-4q859"] Jan 27 16:45:37 crc kubenswrapper[4772]: W0127 16:45:37.213408 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab431622_b724_4ed4_be2b_67ec8b5956db.slice/crio-79c2b30249c61672ff175f58e10d97ce72283314333870eb78ed30e834d9566c WatchSource:0}: Error finding container 79c2b30249c61672ff175f58e10d97ce72283314333870eb78ed30e834d9566c: Status 404 returned error can't find the container with id 79c2b30249c61672ff175f58e10d97ce72283314333870eb78ed30e834d9566c Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.503018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-jmckq"] Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.662758 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:45:37 crc kubenswrapper[4772]: E0127 16:45:37.663017 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.823682 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-jmckq" event={"ID":"d32d3e24-6f03-46cc-b7ae-61383778b183","Type":"ContainerStarted","Data":"753e40c780a5346ae3fdfa2d1a721e15c7d2c633fabd41fa91babb4e476992e8"} Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.823839 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.825452 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-6v7mr" event={"ID":"b6f78da3-da1c-4e27-ab65-581c656f74d9","Type":"ContainerStarted","Data":"35bfac577b0ff976d875ae38ad40b268daada422e60612c1b77a9bf132cb18ba"} Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.826594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-4q859" event={"ID":"ab431622-b724-4ed4-be2b-67ec8b5956db","Type":"ContainerStarted","Data":"79c2b30249c61672ff175f58e10d97ce72283314333870eb78ed30e834d9566c"} Jan 27 16:45:37 crc kubenswrapper[4772]: I0127 16:45:37.859941 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-jmckq" podStartSLOduration=5.8599232610000005 podStartE2EDuration="5.859923261s" podCreationTimestamp="2026-01-27 16:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 16:45:37.852707626 +0000 UTC m=+5923.833316734" watchObservedRunningTime="2026-01-27 16:45:37.859923261 +0000 UTC m=+5923.840532359" Jan 27 16:45:38 crc kubenswrapper[4772]: I0127 16:45:38.837395 4772 generic.go:334] "Generic 
(PLEG): container finished" podID="b6f78da3-da1c-4e27-ab65-581c656f74d9" containerID="35bfac577b0ff976d875ae38ad40b268daada422e60612c1b77a9bf132cb18ba" exitCode=0 Jan 27 16:45:38 crc kubenswrapper[4772]: I0127 16:45:38.837512 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-6v7mr" event={"ID":"b6f78da3-da1c-4e27-ab65-581c656f74d9","Type":"ContainerDied","Data":"35bfac577b0ff976d875ae38ad40b268daada422e60612c1b77a9bf132cb18ba"} Jan 27 16:45:39 crc kubenswrapper[4772]: I0127 16:45:39.852096 4772 generic.go:334] "Generic (PLEG): container finished" podID="ab431622-b724-4ed4-be2b-67ec8b5956db" containerID="66a2fc4b699fdb30694179059bd5f5442f33dad785be85640528cad608c6c9a6" exitCode=0 Jan 27 16:45:39 crc kubenswrapper[4772]: I0127 16:45:39.852152 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-4q859" event={"ID":"ab431622-b724-4ed4-be2b-67ec8b5956db","Type":"ContainerDied","Data":"66a2fc4b699fdb30694179059bd5f5442f33dad785be85640528cad608c6c9a6"} Jan 27 16:45:39 crc kubenswrapper[4772]: I0127 16:45:39.859575 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-6v7mr" event={"ID":"b6f78da3-da1c-4e27-ab65-581c656f74d9","Type":"ContainerStarted","Data":"612c176fae5b04d8bbe36a3dfabc9d240cc5176477b5b9f0f904aef817a2b4a6"} Jan 27 16:45:39 crc kubenswrapper[4772]: I0127 16:45:39.860989 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:39 crc kubenswrapper[4772]: I0127 16:45:39.970245 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-6v7mr" podStartSLOduration=4.634420626 podStartE2EDuration="5.970220262s" podCreationTimestamp="2026-01-27 16:45:34 +0000 UTC" firstStartedPulling="2026-01-27 16:45:35.317893313 +0000 UTC m=+5921.298502411" lastFinishedPulling="2026-01-27 16:45:36.653692949 +0000 UTC m=+5922.634302047" 
observedRunningTime="2026-01-27 16:45:39.931053386 +0000 UTC m=+5925.911662514" watchObservedRunningTime="2026-01-27 16:45:39.970220262 +0000 UTC m=+5925.950829360" Jan 27 16:45:40 crc kubenswrapper[4772]: I0127 16:45:40.870833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-4q859" event={"ID":"ab431622-b724-4ed4-be2b-67ec8b5956db","Type":"ContainerStarted","Data":"551847f80254beee211eac32b0f14ff96dfe201400eed721c5c9fbf12bef6d1c"} Jan 27 16:45:40 crc kubenswrapper[4772]: I0127 16:45:40.871471 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-4q859" Jan 27 16:45:40 crc kubenswrapper[4772]: I0127 16:45:40.900539 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-4q859" podStartSLOduration=4.712181837 podStartE2EDuration="5.9005165s" podCreationTimestamp="2026-01-27 16:45:35 +0000 UTC" firstStartedPulling="2026-01-27 16:45:37.215738379 +0000 UTC m=+5923.196347477" lastFinishedPulling="2026-01-27 16:45:38.404073032 +0000 UTC m=+5924.384682140" observedRunningTime="2026-01-27 16:45:40.888340833 +0000 UTC m=+5926.868949971" watchObservedRunningTime="2026-01-27 16:45:40.9005165 +0000 UTC m=+5926.881125608" Jan 27 16:45:48 crc kubenswrapper[4772]: I0127 16:45:48.302972 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-jmckq" Jan 27 16:45:49 crc kubenswrapper[4772]: I0127 16:45:49.804062 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-6v7mr" Jan 27 16:45:50 crc kubenswrapper[4772]: I0127 16:45:50.663792 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:45:50 crc kubenswrapper[4772]: E0127 16:45:50.664133 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:45:51 crc kubenswrapper[4772]: I0127 16:45:51.309150 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-4q859" Jan 27 16:46:03 crc kubenswrapper[4772]: I0127 16:46:03.663549 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:46:03 crc kubenswrapper[4772]: E0127 16:46:03.669412 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:46:07 crc kubenswrapper[4772]: I0127 16:46:07.904909 4772 scope.go:117] "RemoveContainer" containerID="fba71bd5b1f04f4ce21ecca46c2028b0aeffaa1f208e5b17978c2d0f906cbf36" Jan 27 16:46:09 crc kubenswrapper[4772]: I0127 16:46:09.041463 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rxmk9"] Jan 27 16:46:09 crc kubenswrapper[4772]: I0127 16:46:09.051047 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rxmk9"] Jan 27 16:46:09 crc kubenswrapper[4772]: I0127 16:46:09.061098 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2bff-account-create-update-nq7qs"] Jan 27 16:46:09 crc kubenswrapper[4772]: I0127 16:46:09.068246 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-2bff-account-create-update-nq7qs"] Jan 27 16:46:10 crc kubenswrapper[4772]: I0127 16:46:10.677277 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe4269c-3ff7-49b4-82c3-1a419f676f89" path="/var/lib/kubelet/pods/1fe4269c-3ff7-49b4-82c3-1a419f676f89/volumes" Jan 27 16:46:10 crc kubenswrapper[4772]: I0127 16:46:10.679207 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02ab9dd-317c-4787-aa94-0ad8dff15380" path="/var/lib/kubelet/pods/d02ab9dd-317c-4787-aa94-0ad8dff15380/volumes" Jan 27 16:46:15 crc kubenswrapper[4772]: I0127 16:46:15.038569 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2jctn"] Jan 27 16:46:15 crc kubenswrapper[4772]: I0127 16:46:15.047352 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2jctn"] Jan 27 16:46:15 crc kubenswrapper[4772]: I0127 16:46:15.663007 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:46:15 crc kubenswrapper[4772]: E0127 16:46:15.663306 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:46:16 crc kubenswrapper[4772]: I0127 16:46:16.684623 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370ff587-7186-4f81-83a2-886a15900229" path="/var/lib/kubelet/pods/370ff587-7186-4f81-83a2-886a15900229/volumes" Jan 27 16:46:26 crc kubenswrapper[4772]: I0127 16:46:26.662735 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:46:26 crc 
kubenswrapper[4772]: E0127 16:46:26.663660 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:46:37 crc kubenswrapper[4772]: I0127 16:46:37.663953 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:46:37 crc kubenswrapper[4772]: E0127 16:46:37.664754 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:46:45 crc kubenswrapper[4772]: I0127 16:46:45.055821 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fbd6-account-create-update-kvjsf"] Jan 27 16:46:45 crc kubenswrapper[4772]: I0127 16:46:45.064813 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-49fxh"] Jan 27 16:46:45 crc kubenswrapper[4772]: I0127 16:46:45.073594 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fbd6-account-create-update-kvjsf"] Jan 27 16:46:45 crc kubenswrapper[4772]: I0127 16:46:45.106109 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-49fxh"] Jan 27 16:46:46 crc kubenswrapper[4772]: I0127 16:46:46.681555 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b" 
path="/var/lib/kubelet/pods/0c10ae3d-af19-4efe-a5b0-783ccf7c5f1b/volumes" Jan 27 16:46:46 crc kubenswrapper[4772]: I0127 16:46:46.684893 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd740e1-b6ee-444d-b71e-e18d4837ef8a" path="/var/lib/kubelet/pods/6dd740e1-b6ee-444d-b71e-e18d4837ef8a/volumes" Jan 27 16:46:48 crc kubenswrapper[4772]: I0127 16:46:48.663782 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:46:49 crc kubenswrapper[4772]: I0127 16:46:49.578114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4"} Jan 27 16:46:58 crc kubenswrapper[4772]: I0127 16:46:58.041240 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-xvkmn"] Jan 27 16:46:58 crc kubenswrapper[4772]: I0127 16:46:58.057151 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-xvkmn"] Jan 27 16:46:58 crc kubenswrapper[4772]: I0127 16:46:58.677644 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2" path="/var/lib/kubelet/pods/fce2770e-b90c-4d9c-8a0c-40fe0f48aeb2/volumes" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 16:47:08.016968 4772 scope.go:117] "RemoveContainer" containerID="5a8e2d8cff2b48fb4540fd3d3c996a83107f6a91a01287bf66fdb29b10e52c6b" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 16:47:08.063257 4772 scope.go:117] "RemoveContainer" containerID="47404bfd4ce996befd817152654b651725e0f51b6b3b55ee5ee110296d25e0c8" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 16:47:08.111543 4772 scope.go:117] "RemoveContainer" containerID="0eb151bf9a1bfafe986b27f8d85f2a3fcbd2a1f6be731bcd2ce95359e6a6e136" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 
16:47:08.141231 4772 scope.go:117] "RemoveContainer" containerID="ccfecf4ef23c50d400b8ca7553bbf1eff4d9d62e8caff28f5a3562a7630cf393" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 16:47:08.201487 4772 scope.go:117] "RemoveContainer" containerID="7820202ed4fe89d2fb42f752ab53ea23a8554f7e3772282ffa645c3878d8acde" Jan 27 16:47:08 crc kubenswrapper[4772]: I0127 16:47:08.224716 4772 scope.go:117] "RemoveContainer" containerID="39e5a537d87039cbc6538b5672d3a7484ab8d88c163808f5672ea54c55115098" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.277293 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.281317 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.284047 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.351591 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.351679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk77k\" (UniqueName: \"kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.351826 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.453080 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.453137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk77k\" (UniqueName: \"kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.453243 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.453593 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.453612 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.470977 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk77k\" (UniqueName: \"kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k\") pod \"redhat-operators-68vzn\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:35 crc kubenswrapper[4772]: I0127 16:47:35.625811 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:36 crc kubenswrapper[4772]: I0127 16:47:36.105098 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:37 crc kubenswrapper[4772]: I0127 16:47:37.096601 4772 generic.go:334] "Generic (PLEG): container finished" podID="9cc67403-20b9-4476-be2c-26639283d648" containerID="bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451" exitCode=0 Jan 27 16:47:37 crc kubenswrapper[4772]: I0127 16:47:37.096680 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerDied","Data":"bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451"} Jan 27 16:47:37 crc kubenswrapper[4772]: I0127 16:47:37.096848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerStarted","Data":"81502ee598efdd39b85faa6c7f5c54aee251d669948a2c94968e20419d6f1166"} Jan 27 16:47:38 crc kubenswrapper[4772]: I0127 16:47:38.115816 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerStarted","Data":"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357"} Jan 27 16:47:39 crc kubenswrapper[4772]: I0127 16:47:39.139397 4772 generic.go:334] "Generic (PLEG): container finished" podID="9cc67403-20b9-4476-be2c-26639283d648" containerID="b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357" exitCode=0 Jan 27 16:47:39 crc kubenswrapper[4772]: I0127 16:47:39.139489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerDied","Data":"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357"} Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.039447 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-ab59-account-create-update-p9dsf"] Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.050746 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ng4h8"] Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.061825 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-ab59-account-create-update-p9dsf"] Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.071761 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ng4h8"] Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.153332 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerStarted","Data":"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20"} Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.180327 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-68vzn" podStartSLOduration=2.737921604 
podStartE2EDuration="5.180309673s" podCreationTimestamp="2026-01-27 16:47:35 +0000 UTC" firstStartedPulling="2026-01-27 16:47:37.099562858 +0000 UTC m=+6043.080171956" lastFinishedPulling="2026-01-27 16:47:39.541950917 +0000 UTC m=+6045.522560025" observedRunningTime="2026-01-27 16:47:40.170448592 +0000 UTC m=+6046.151057720" watchObservedRunningTime="2026-01-27 16:47:40.180309673 +0000 UTC m=+6046.160918771" Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.676848 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b150c73-f6eb-4193-81ef-84941ff1abef" path="/var/lib/kubelet/pods/2b150c73-f6eb-4193-81ef-84941ff1abef/volumes" Jan 27 16:47:40 crc kubenswrapper[4772]: I0127 16:47:40.678266 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615446cc-6fba-46f2-aad9-434f11519be9" path="/var/lib/kubelet/pods/615446cc-6fba-46f2-aad9-434f11519be9/volumes" Jan 27 16:47:45 crc kubenswrapper[4772]: I0127 16:47:45.626561 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:45 crc kubenswrapper[4772]: I0127 16:47:45.627134 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:46 crc kubenswrapper[4772]: I0127 16:47:46.672946 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-68vzn" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="registry-server" probeResult="failure" output=< Jan 27 16:47:46 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 16:47:46 crc kubenswrapper[4772]: > Jan 27 16:47:48 crc kubenswrapper[4772]: I0127 16:47:48.036272 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wlbbm"] Jan 27 16:47:48 crc kubenswrapper[4772]: I0127 16:47:48.047476 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-sync-wlbbm"] Jan 27 16:47:48 crc kubenswrapper[4772]: I0127 16:47:48.679791 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac290494-b5ad-4d85-9f14-daf092e3a6ed" path="/var/lib/kubelet/pods/ac290494-b5ad-4d85-9f14-daf092e3a6ed/volumes" Jan 27 16:47:55 crc kubenswrapper[4772]: I0127 16:47:55.688049 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:55 crc kubenswrapper[4772]: I0127 16:47:55.757150 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:55 crc kubenswrapper[4772]: I0127 16:47:55.950946 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.341310 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-68vzn" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="registry-server" containerID="cri-o://181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20" gracePeriod=2 Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.807109 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.833877 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities\") pod \"9cc67403-20b9-4476-be2c-26639283d648\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.833950 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content\") pod \"9cc67403-20b9-4476-be2c-26639283d648\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.834230 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk77k\" (UniqueName: \"kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k\") pod \"9cc67403-20b9-4476-be2c-26639283d648\" (UID: \"9cc67403-20b9-4476-be2c-26639283d648\") " Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.834784 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities" (OuterVolumeSpecName: "utilities") pod "9cc67403-20b9-4476-be2c-26639283d648" (UID: "9cc67403-20b9-4476-be2c-26639283d648"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.834939 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.839523 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k" (OuterVolumeSpecName: "kube-api-access-lk77k") pod "9cc67403-20b9-4476-be2c-26639283d648" (UID: "9cc67403-20b9-4476-be2c-26639283d648"). InnerVolumeSpecName "kube-api-access-lk77k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.936756 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk77k\" (UniqueName: \"kubernetes.io/projected/9cc67403-20b9-4476-be2c-26639283d648-kube-api-access-lk77k\") on node \"crc\" DevicePath \"\"" Jan 27 16:47:57 crc kubenswrapper[4772]: I0127 16:47:57.945179 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cc67403-20b9-4476-be2c-26639283d648" (UID: "9cc67403-20b9-4476-be2c-26639283d648"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.040334 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc67403-20b9-4476-be2c-26639283d648-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.356676 4772 generic.go:334] "Generic (PLEG): container finished" podID="9cc67403-20b9-4476-be2c-26639283d648" containerID="181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20" exitCode=0 Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.356733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerDied","Data":"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20"} Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.356786 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68vzn" event={"ID":"9cc67403-20b9-4476-be2c-26639283d648","Type":"ContainerDied","Data":"81502ee598efdd39b85faa6c7f5c54aee251d669948a2c94968e20419d6f1166"} Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.356787 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68vzn" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.356803 4772 scope.go:117] "RemoveContainer" containerID="181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.402811 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.409412 4772 scope.go:117] "RemoveContainer" containerID="b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.410075 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-68vzn"] Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.432520 4772 scope.go:117] "RemoveContainer" containerID="bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.471750 4772 scope.go:117] "RemoveContainer" containerID="181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20" Jan 27 16:47:58 crc kubenswrapper[4772]: E0127 16:47:58.472349 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20\": container with ID starting with 181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20 not found: ID does not exist" containerID="181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.472424 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20"} err="failed to get container status \"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20\": rpc error: code = NotFound desc = could not find container 
\"181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20\": container with ID starting with 181b8968a78fb7fd292ffd699059c3cf38d97011a6ab52479b4d69804b746f20 not found: ID does not exist" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.472447 4772 scope.go:117] "RemoveContainer" containerID="b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357" Jan 27 16:47:58 crc kubenswrapper[4772]: E0127 16:47:58.472697 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357\": container with ID starting with b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357 not found: ID does not exist" containerID="b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.472732 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357"} err="failed to get container status \"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357\": rpc error: code = NotFound desc = could not find container \"b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357\": container with ID starting with b04b8450f2bdaed53e1022022c6c5bc40b03e9ec1d425383d5676d25979a0357 not found: ID does not exist" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.472750 4772 scope.go:117] "RemoveContainer" containerID="bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451" Jan 27 16:47:58 crc kubenswrapper[4772]: E0127 16:47:58.473027 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451\": container with ID starting with bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451 not found: ID does not exist" 
containerID="bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.473070 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451"} err="failed to get container status \"bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451\": rpc error: code = NotFound desc = could not find container \"bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451\": container with ID starting with bcb1f038aa2b5d55b98fa48261958780ab7273f33a9cbb5d63c0080c7664d451 not found: ID does not exist" Jan 27 16:47:58 crc kubenswrapper[4772]: I0127 16:47:58.677401 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc67403-20b9-4476-be2c-26639283d648" path="/var/lib/kubelet/pods/9cc67403-20b9-4476-be2c-26639283d648/volumes" Jan 27 16:48:08 crc kubenswrapper[4772]: I0127 16:48:08.383297 4772 scope.go:117] "RemoveContainer" containerID="39b253bea060fd0ef46113186002866f36924e03fe711afec1f871af51f40edf" Jan 27 16:48:08 crc kubenswrapper[4772]: I0127 16:48:08.417437 4772 scope.go:117] "RemoveContainer" containerID="972404acaa773e237766879c66e6f19a8b3951bb52780f66333fe8405eb0ccb2" Jan 27 16:48:08 crc kubenswrapper[4772]: I0127 16:48:08.466302 4772 scope.go:117] "RemoveContainer" containerID="04b6871605c5a7ec7b4197615c41edaa0e9453b396fcff037ce590df9243c6a0" Jan 27 16:48:22 crc kubenswrapper[4772]: I0127 16:48:22.064606 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wfhq4"] Jan 27 16:48:22 crc kubenswrapper[4772]: I0127 16:48:22.075023 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c674-account-create-update-zr9fr"] Jan 27 16:48:22 crc kubenswrapper[4772]: I0127 16:48:22.084972 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wfhq4"] Jan 27 16:48:22 crc kubenswrapper[4772]: 
I0127 16:48:22.094094 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c674-account-create-update-zr9fr"] Jan 27 16:48:22 crc kubenswrapper[4772]: I0127 16:48:22.675792 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6c1c65-36ca-4017-a8e2-5e22a550d601" path="/var/lib/kubelet/pods/5a6c1c65-36ca-4017-a8e2-5e22a550d601/volumes" Jan 27 16:48:22 crc kubenswrapper[4772]: I0127 16:48:22.677645 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c87e84-0237-41a3-b248-59f0e0156b81" path="/var/lib/kubelet/pods/68c87e84-0237-41a3-b248-59f0e0156b81/volumes" Jan 27 16:48:27 crc kubenswrapper[4772]: I0127 16:48:27.032615 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4pvhb"] Jan 27 16:48:27 crc kubenswrapper[4772]: I0127 16:48:27.041827 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4pvhb"] Jan 27 16:48:28 crc kubenswrapper[4772]: I0127 16:48:28.675374 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b972e003-d915-4c6e-b84e-00d1f53740c1" path="/var/lib/kubelet/pods/b972e003-d915-4c6e-b84e-00d1f53740c1/volumes" Jan 27 16:49:08 crc kubenswrapper[4772]: I0127 16:49:08.563515 4772 scope.go:117] "RemoveContainer" containerID="159f9ff911847523ab0387be1212efb17cae848a6dcdc1e80961565a39d1eac9" Jan 27 16:49:08 crc kubenswrapper[4772]: I0127 16:49:08.587540 4772 scope.go:117] "RemoveContainer" containerID="fffc3d88e7cbd76e4f8c55e9d4f80e7dfb6325460b6429ed67f89d060f3c380c" Jan 27 16:49:08 crc kubenswrapper[4772]: I0127 16:49:08.631968 4772 scope.go:117] "RemoveContainer" containerID="897a6f6215480fb2b302f194b142ae62f3461ba459140cef6dbb5530febc39e7" Jan 27 16:49:12 crc kubenswrapper[4772]: I0127 16:49:12.058907 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:49:12 crc kubenswrapper[4772]: I0127 16:49:12.059598 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.048999 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-5g8c9"] Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.058723 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wmvjd"] Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.078881 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-5g8c9"] Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.095346 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wmvjd"] Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.681208 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b3d773-720e-42c5-af9e-abddc2180ac7" path="/var/lib/kubelet/pods/81b3d773-720e-42c5-af9e-abddc2180ac7/volumes" Jan 27 16:49:24 crc kubenswrapper[4772]: I0127 16:49:24.683131 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952d9a1e-efbf-4617-94af-b5ad42cce494" path="/var/lib/kubelet/pods/952d9a1e-efbf-4617-94af-b5ad42cce494/volumes" Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.034205 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a20c-account-create-update-7pf8l"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.046056 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a20c-account-create-update-7pf8l"] 
Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.057335 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a831-account-create-update-bq6jl"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.066700 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-fxhzl"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.076738 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e098-account-create-update-s8qnw"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.086526 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a831-account-create-update-bq6jl"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.096433 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-fxhzl"] Jan 27 16:49:25 crc kubenswrapper[4772]: I0127 16:49:25.106611 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e098-account-create-update-s8qnw"] Jan 27 16:49:26 crc kubenswrapper[4772]: I0127 16:49:26.674987 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4545581d-5f56-406d-938f-c3b073fdcbce" path="/var/lib/kubelet/pods/4545581d-5f56-406d-938f-c3b073fdcbce/volumes" Jan 27 16:49:26 crc kubenswrapper[4772]: I0127 16:49:26.676234 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c86af5-fd1f-4c53-8978-2b436db59b2a" path="/var/lib/kubelet/pods/84c86af5-fd1f-4c53-8978-2b436db59b2a/volumes" Jan 27 16:49:26 crc kubenswrapper[4772]: I0127 16:49:26.677481 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f6e427-99ce-4873-bacc-697edca3d34e" path="/var/lib/kubelet/pods/86f6e427-99ce-4873-bacc-697edca3d34e/volumes" Jan 27 16:49:26 crc kubenswrapper[4772]: I0127 16:49:26.678774 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e808813a-e588-4fb9-a15d-588d94a4cd59" 
path="/var/lib/kubelet/pods/e808813a-e588-4fb9-a15d-588d94a4cd59/volumes" Jan 27 16:49:39 crc kubenswrapper[4772]: I0127 16:49:39.049718 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k56t7"] Jan 27 16:49:39 crc kubenswrapper[4772]: I0127 16:49:39.061949 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k56t7"] Jan 27 16:49:40 crc kubenswrapper[4772]: I0127 16:49:40.675226 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eedd23ad-e532-401a-a991-4bca54fc2711" path="/var/lib/kubelet/pods/eedd23ad-e532-401a-a991-4bca54fc2711/volumes" Jan 27 16:49:42 crc kubenswrapper[4772]: I0127 16:49:42.058105 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:49:42 crc kubenswrapper[4772]: I0127 16:49:42.058648 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.236432 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:49:57 crc kubenswrapper[4772]: E0127 16:49:57.237429 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="extract-content" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.237447 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="extract-content" Jan 27 16:49:57 crc 
kubenswrapper[4772]: E0127 16:49:57.237460 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="extract-utilities" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.237468 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="extract-utilities" Jan 27 16:49:57 crc kubenswrapper[4772]: E0127 16:49:57.237484 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="registry-server" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.237493 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="registry-server" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.237716 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc67403-20b9-4476-be2c-26639283d648" containerName="registry-server" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.239263 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.252678 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.358088 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rmz\" (UniqueName: \"kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.358374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.358614 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.460634 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rmz\" (UniqueName: \"kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.460760 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.460796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.461263 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.461355 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.482732 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rmz\" (UniqueName: \"kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz\") pod \"community-operators-gwt4f\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:57 crc kubenswrapper[4772]: I0127 16:49:57.560159 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.030308 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dk47"] Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.040349 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9dk47"] Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.125127 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:49:58 crc kubenswrapper[4772]: W0127 16:49:58.134811 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf667e2c1_1507_4b30_9483_15848686b8b6.slice/crio-3b5d228719dda3b1a910a73f9e85ee82e7079ab00ea680955c59d6f4b081932e WatchSource:0}: Error finding container 3b5d228719dda3b1a910a73f9e85ee82e7079ab00ea680955c59d6f4b081932e: Status 404 returned error can't find the container with id 3b5d228719dda3b1a910a73f9e85ee82e7079ab00ea680955c59d6f4b081932e Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.674251 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73df4be-e448-4930-ae5e-d74fde1b4b6d" path="/var/lib/kubelet/pods/c73df4be-e448-4930-ae5e-d74fde1b4b6d/volumes" Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.886392 4772 generic.go:334] "Generic (PLEG): container finished" podID="f667e2c1-1507-4b30-9483-15848686b8b6" containerID="888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d" exitCode=0 Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.886447 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerDied","Data":"888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d"} Jan 27 16:49:58 crc kubenswrapper[4772]: 
I0127 16:49:58.886505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerStarted","Data":"3b5d228719dda3b1a910a73f9e85ee82e7079ab00ea680955c59d6f4b081932e"} Jan 27 16:49:58 crc kubenswrapper[4772]: I0127 16:49:58.888628 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:49:59 crc kubenswrapper[4772]: I0127 16:49:59.044891 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-9vmft"] Jan 27 16:49:59 crc kubenswrapper[4772]: I0127 16:49:59.052066 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-9vmft"] Jan 27 16:49:59 crc kubenswrapper[4772]: I0127 16:49:59.898040 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerStarted","Data":"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0"} Jan 27 16:50:00 crc kubenswrapper[4772]: I0127 16:50:00.683893 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c79395-a929-4f0d-8aa7-05f24412baed" path="/var/lib/kubelet/pods/f4c79395-a929-4f0d-8aa7-05f24412baed/volumes" Jan 27 16:50:00 crc kubenswrapper[4772]: I0127 16:50:00.914465 4772 generic.go:334] "Generic (PLEG): container finished" podID="f667e2c1-1507-4b30-9483-15848686b8b6" containerID="43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0" exitCode=0 Jan 27 16:50:00 crc kubenswrapper[4772]: I0127 16:50:00.914554 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerDied","Data":"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0"} Jan 27 16:50:01 crc kubenswrapper[4772]: I0127 16:50:01.942144 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerStarted","Data":"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850"} Jan 27 16:50:01 crc kubenswrapper[4772]: I0127 16:50:01.968209 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gwt4f" podStartSLOduration=2.546167335 podStartE2EDuration="4.96818838s" podCreationTimestamp="2026-01-27 16:49:57 +0000 UTC" firstStartedPulling="2026-01-27 16:49:58.888406446 +0000 UTC m=+6184.869015544" lastFinishedPulling="2026-01-27 16:50:01.310427491 +0000 UTC m=+6187.291036589" observedRunningTime="2026-01-27 16:50:01.959640306 +0000 UTC m=+6187.940249414" watchObservedRunningTime="2026-01-27 16:50:01.96818838 +0000 UTC m=+6187.948797498" Jan 27 16:50:07 crc kubenswrapper[4772]: I0127 16:50:07.561285 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:07 crc kubenswrapper[4772]: I0127 16:50:07.562255 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:07 crc kubenswrapper[4772]: I0127 16:50:07.603216 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.042679 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.768510 4772 scope.go:117] "RemoveContainer" containerID="f05adee94e87980f08389d2716dd0d5a92a148aff23a1b26c057f06fd19c6f9b" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.799560 4772 scope.go:117] "RemoveContainer" 
containerID="b2dc1b1a95dcd405b12ae74c124c7458f4aa13c48525ad20496373458b83b670" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.855508 4772 scope.go:117] "RemoveContainer" containerID="4799caa746e9ac611c5360c3f4fb6d88e8100346e94f7137cac433d747b98815" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.929525 4772 scope.go:117] "RemoveContainer" containerID="63117e003669f061743dd5211454454c043cc51173da6608feb5766b651070d3" Jan 27 16:50:08 crc kubenswrapper[4772]: I0127 16:50:08.976691 4772 scope.go:117] "RemoveContainer" containerID="cba54a4b30bd539d4b80fc206c2394f0309c6182e722300037332e6e39b8dedb" Jan 27 16:50:09 crc kubenswrapper[4772]: I0127 16:50:09.016743 4772 scope.go:117] "RemoveContainer" containerID="0f3d5e3b05300094485e382bc00ab51b2c741fade4c1a775474788ecee8633d6" Jan 27 16:50:09 crc kubenswrapper[4772]: I0127 16:50:09.073475 4772 scope.go:117] "RemoveContainer" containerID="83a6d65ba439c93f6ff663d0a68308cf85e00e3f86836413a3077a6bf72f351a" Jan 27 16:50:09 crc kubenswrapper[4772]: I0127 16:50:09.095324 4772 scope.go:117] "RemoveContainer" containerID="cd101790786da89124bf4fdc7b4bace7b135a40af114a9eeb7dbc7a4372fd732" Jan 27 16:50:09 crc kubenswrapper[4772]: I0127 16:50:09.135768 4772 scope.go:117] "RemoveContainer" containerID="92b70acb37a2142424102ba84a18ba6908a56b707c2a540be2311cd899ea872a" Jan 27 16:50:10 crc kubenswrapper[4772]: I0127 16:50:10.026511 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.036605 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gwt4f" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="registry-server" containerID="cri-o://9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850" gracePeriod=2 Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.489963 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.566351 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content\") pod \"f667e2c1-1507-4b30-9483-15848686b8b6\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.566406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4rmz\" (UniqueName: \"kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz\") pod \"f667e2c1-1507-4b30-9483-15848686b8b6\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.566613 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities\") pod \"f667e2c1-1507-4b30-9483-15848686b8b6\" (UID: \"f667e2c1-1507-4b30-9483-15848686b8b6\") " Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.567594 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities" (OuterVolumeSpecName: "utilities") pod "f667e2c1-1507-4b30-9483-15848686b8b6" (UID: "f667e2c1-1507-4b30-9483-15848686b8b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.572876 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz" (OuterVolumeSpecName: "kube-api-access-x4rmz") pod "f667e2c1-1507-4b30-9483-15848686b8b6" (UID: "f667e2c1-1507-4b30-9483-15848686b8b6"). InnerVolumeSpecName "kube-api-access-x4rmz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.620417 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f667e2c1-1507-4b30-9483-15848686b8b6" (UID: "f667e2c1-1507-4b30-9483-15848686b8b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.668955 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.668991 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f667e2c1-1507-4b30-9483-15848686b8b6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:50:11 crc kubenswrapper[4772]: I0127 16:50:11.669001 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4rmz\" (UniqueName: \"kubernetes.io/projected/f667e2c1-1507-4b30-9483-15848686b8b6-kube-api-access-x4rmz\") on node \"crc\" DevicePath \"\"" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.050422 4772 generic.go:334] "Generic (PLEG): container finished" podID="f667e2c1-1507-4b30-9483-15848686b8b6" containerID="9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850" exitCode=0 Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.050478 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerDied","Data":"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850"} Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.050503 4772 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-gwt4f" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.050520 4772 scope.go:117] "RemoveContainer" containerID="9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.050509 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gwt4f" event={"ID":"f667e2c1-1507-4b30-9483-15848686b8b6","Type":"ContainerDied","Data":"3b5d228719dda3b1a910a73f9e85ee82e7079ab00ea680955c59d6f4b081932e"} Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.058372 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.058420 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.058460 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.059762 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:50:12 crc 
kubenswrapper[4772]: I0127 16:50:12.059827 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4" gracePeriod=600 Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.075677 4772 scope.go:117] "RemoveContainer" containerID="43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.083479 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.095096 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gwt4f"] Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.106636 4772 scope.go:117] "RemoveContainer" containerID="888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.139538 4772 scope.go:117] "RemoveContainer" containerID="9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850" Jan 27 16:50:12 crc kubenswrapper[4772]: E0127 16:50:12.139931 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850\": container with ID starting with 9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850 not found: ID does not exist" containerID="9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.139988 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850"} err="failed to get container status 
\"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850\": rpc error: code = NotFound desc = could not find container \"9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850\": container with ID starting with 9b2105b198f7a6cee59ace3055430b991acc811a7d60365d1268136259fcd850 not found: ID does not exist" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.140019 4772 scope.go:117] "RemoveContainer" containerID="43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0" Jan 27 16:50:12 crc kubenswrapper[4772]: E0127 16:50:12.140370 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0\": container with ID starting with 43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0 not found: ID does not exist" containerID="43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.140403 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0"} err="failed to get container status \"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0\": rpc error: code = NotFound desc = could not find container \"43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0\": container with ID starting with 43917f23ef47a782c97321d1bc6408d80aea5b744655f841e67b5391ae8c40f0 not found: ID does not exist" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.140425 4772 scope.go:117] "RemoveContainer" containerID="888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d" Jan 27 16:50:12 crc kubenswrapper[4772]: E0127 16:50:12.140684 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d\": container with ID starting with 888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d not found: ID does not exist" containerID="888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.140712 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d"} err="failed to get container status \"888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d\": rpc error: code = NotFound desc = could not find container \"888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d\": container with ID starting with 888b7a4fd3e63152f4d8fa6e54cda7273f183ea1cf263815331a13ae71b4412d not found: ID does not exist" Jan 27 16:50:12 crc kubenswrapper[4772]: I0127 16:50:12.673681 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" path="/var/lib/kubelet/pods/f667e2c1-1507-4b30-9483-15848686b8b6/volumes" Jan 27 16:50:13 crc kubenswrapper[4772]: I0127 16:50:13.093713 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4" exitCode=0 Jan 27 16:50:13 crc kubenswrapper[4772]: I0127 16:50:13.093772 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4"} Jan 27 16:50:13 crc kubenswrapper[4772]: I0127 16:50:13.094090 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317"} Jan 27 16:50:13 crc kubenswrapper[4772]: I0127 16:50:13.094119 4772 scope.go:117] "RemoveContainer" containerID="96a40f4ae71e3b6b4ac45c7d87f99fc2edaa1544245388fae41f53b32b3f5a69" Jan 27 16:50:17 crc kubenswrapper[4772]: I0127 16:50:17.072662 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-p6mbs"] Jan 27 16:50:17 crc kubenswrapper[4772]: I0127 16:50:17.085854 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-p6mbs"] Jan 27 16:50:18 crc kubenswrapper[4772]: I0127 16:50:18.677023 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaefb4fd-175a-4431-bd2f-7fc3205684b9" path="/var/lib/kubelet/pods/eaefb4fd-175a-4431-bd2f-7fc3205684b9/volumes" Jan 27 16:51:01 crc kubenswrapper[4772]: I0127 16:51:01.039301 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-hk6d6"] Jan 27 16:51:01 crc kubenswrapper[4772]: I0127 16:51:01.048196 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-e17c-account-create-update-4ccmc"] Jan 27 16:51:01 crc kubenswrapper[4772]: I0127 16:51:01.057989 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-e17c-account-create-update-4ccmc"] Jan 27 16:51:01 crc kubenswrapper[4772]: I0127 16:51:01.066443 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-hk6d6"] Jan 27 16:51:02 crc kubenswrapper[4772]: I0127 16:51:02.675033 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357c50eb-1246-4dd8-975c-b10d09439cbd" path="/var/lib/kubelet/pods/357c50eb-1246-4dd8-975c-b10d09439cbd/volumes" Jan 27 16:51:02 crc kubenswrapper[4772]: I0127 16:51:02.675791 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d67b98cc-0659-4ebd-a96a-025044731558" path="/var/lib/kubelet/pods/d67b98cc-0659-4ebd-a96a-025044731558/volumes" Jan 27 16:51:09 crc kubenswrapper[4772]: I0127 16:51:09.289081 4772 scope.go:117] "RemoveContainer" containerID="2746702aa9709faf5d661e469190b51cee57d01ba7b40596d6751ad98441dd4d" Jan 27 16:51:09 crc kubenswrapper[4772]: I0127 16:51:09.323971 4772 scope.go:117] "RemoveContainer" containerID="f57fe908a2ad59651426657b881a5c35f1371cdc115dc93b86dbcb58952ce8a9" Jan 27 16:51:09 crc kubenswrapper[4772]: I0127 16:51:09.419885 4772 scope.go:117] "RemoveContainer" containerID="2c24a009396935ad397dec4435d87a19075c63011ad8f23a534d35cc814f6ddc" Jan 27 16:51:09 crc kubenswrapper[4772]: I0127 16:51:09.454656 4772 scope.go:117] "RemoveContainer" containerID="dab4307441a9ab5d1178a5547c03edae0ead3f783db4ef2bf29e8414026bb08f" Jan 27 16:51:12 crc kubenswrapper[4772]: I0127 16:51:12.056478 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-66rdf"] Jan 27 16:51:12 crc kubenswrapper[4772]: I0127 16:51:12.068295 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-66rdf"] Jan 27 16:51:12 crc kubenswrapper[4772]: I0127 16:51:12.680005 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bc1548-ca21-4230-a5db-a9321ab69a37" path="/var/lib/kubelet/pods/e9bc1548-ca21-4230-a5db-a9321ab69a37/volumes" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.136526 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:21 crc kubenswrapper[4772]: E0127 16:51:21.137521 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="registry-server" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.137533 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="registry-server" Jan 27 16:51:21 crc 
kubenswrapper[4772]: E0127 16:51:21.137549 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="extract-utilities" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.137557 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="extract-utilities" Jan 27 16:51:21 crc kubenswrapper[4772]: E0127 16:51:21.137585 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="extract-content" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.137591 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="extract-content" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.137764 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f667e2c1-1507-4b30-9483-15848686b8b6" containerName="registry-server" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.139025 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.162659 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.218977 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.219122 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.219280 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xck\" (UniqueName: \"kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.321716 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xck\" (UniqueName: \"kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.322251 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.322382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.322963 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.323110 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.343907 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xck\" (UniqueName: \"kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck\") pod \"redhat-marketplace-bsdgv\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.463741 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:21 crc kubenswrapper[4772]: I0127 16:51:21.928879 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:21 crc kubenswrapper[4772]: W0127 16:51:21.930293 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode094ebf0_a014_4071_974b_f8cf8bf8a1d3.slice/crio-457497aa77a154894b046aead34072922305c6805acc56f00078d0211b9d58de WatchSource:0}: Error finding container 457497aa77a154894b046aead34072922305c6805acc56f00078d0211b9d58de: Status 404 returned error can't find the container with id 457497aa77a154894b046aead34072922305c6805acc56f00078d0211b9d58de Jan 27 16:51:22 crc kubenswrapper[4772]: I0127 16:51:22.795758 4772 generic.go:334] "Generic (PLEG): container finished" podID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerID="d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0" exitCode=0 Jan 27 16:51:22 crc kubenswrapper[4772]: I0127 16:51:22.795809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerDied","Data":"d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0"} Jan 27 16:51:22 crc kubenswrapper[4772]: I0127 16:51:22.796285 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerStarted","Data":"457497aa77a154894b046aead34072922305c6805acc56f00078d0211b9d58de"} Jan 27 16:51:23 crc kubenswrapper[4772]: I0127 16:51:23.806641 4772 generic.go:334] "Generic (PLEG): container finished" podID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerID="96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef" exitCode=0 Jan 27 16:51:23 crc kubenswrapper[4772]: I0127 
16:51:23.806731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerDied","Data":"96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef"} Jan 27 16:51:24 crc kubenswrapper[4772]: I0127 16:51:24.823273 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerStarted","Data":"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe"} Jan 27 16:51:24 crc kubenswrapper[4772]: I0127 16:51:24.849698 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bsdgv" podStartSLOduration=2.406793976 podStartE2EDuration="3.849677539s" podCreationTimestamp="2026-01-27 16:51:21 +0000 UTC" firstStartedPulling="2026-01-27 16:51:22.79921052 +0000 UTC m=+6268.779819628" lastFinishedPulling="2026-01-27 16:51:24.242094063 +0000 UTC m=+6270.222703191" observedRunningTime="2026-01-27 16:51:24.845604853 +0000 UTC m=+6270.826213981" watchObservedRunningTime="2026-01-27 16:51:24.849677539 +0000 UTC m=+6270.830286657" Jan 27 16:51:31 crc kubenswrapper[4772]: I0127 16:51:31.464046 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:31 crc kubenswrapper[4772]: I0127 16:51:31.464749 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:31 crc kubenswrapper[4772]: I0127 16:51:31.511727 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:31 crc kubenswrapper[4772]: I0127 16:51:31.959590 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 
16:51:32 crc kubenswrapper[4772]: I0127 16:51:32.007452 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:33 crc kubenswrapper[4772]: I0127 16:51:33.926701 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bsdgv" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="registry-server" containerID="cri-o://924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe" gracePeriod=2 Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.879561 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.900432 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content\") pod \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.900783 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7xck\" (UniqueName: \"kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck\") pod \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.902024 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities\") pod \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\" (UID: \"e094ebf0-a014-4071-974b-f8cf8bf8a1d3\") " Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.902676 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities" (OuterVolumeSpecName: "utilities") pod "e094ebf0-a014-4071-974b-f8cf8bf8a1d3" (UID: "e094ebf0-a014-4071-974b-f8cf8bf8a1d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.907744 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck" (OuterVolumeSpecName: "kube-api-access-h7xck") pod "e094ebf0-a014-4071-974b-f8cf8bf8a1d3" (UID: "e094ebf0-a014-4071-974b-f8cf8bf8a1d3"). InnerVolumeSpecName "kube-api-access-h7xck". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.927655 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e094ebf0-a014-4071-974b-f8cf8bf8a1d3" (UID: "e094ebf0-a014-4071-974b-f8cf8bf8a1d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.939045 4772 generic.go:334] "Generic (PLEG): container finished" podID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerID="924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe" exitCode=0 Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.939083 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerDied","Data":"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe"} Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.939111 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bsdgv" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.939125 4772 scope.go:117] "RemoveContainer" containerID="924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.939115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bsdgv" event={"ID":"e094ebf0-a014-4071-974b-f8cf8bf8a1d3","Type":"ContainerDied","Data":"457497aa77a154894b046aead34072922305c6805acc56f00078d0211b9d58de"} Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.975179 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.975673 4772 scope.go:117] "RemoveContainer" containerID="96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef" Jan 27 16:51:34 crc kubenswrapper[4772]: I0127 16:51:34.986325 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bsdgv"] Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.000597 4772 scope.go:117] "RemoveContainer" containerID="d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.004419 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7xck\" (UniqueName: \"kubernetes.io/projected/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-kube-api-access-h7xck\") on node \"crc\" DevicePath \"\"" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.004443 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.004457 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e094ebf0-a014-4071-974b-f8cf8bf8a1d3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.045021 4772 scope.go:117] "RemoveContainer" containerID="924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe" Jan 27 16:51:35 crc kubenswrapper[4772]: E0127 16:51:35.045521 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe\": container with ID starting with 924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe not found: ID does not exist" containerID="924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.045557 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe"} err="failed to get container status \"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe\": rpc error: code = NotFound desc = could not find container \"924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe\": container with ID starting with 924cb21c049d7e2b564fcdeda1cd8c7d8b841546eb43ece1a79a291088be7efe not found: ID does not exist" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.045581 4772 scope.go:117] "RemoveContainer" containerID="96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef" Jan 27 16:51:35 crc kubenswrapper[4772]: E0127 16:51:35.046010 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef\": container with ID starting with 96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef not found: ID does not exist" containerID="96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef" Jan 
27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.046046 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef"} err="failed to get container status \"96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef\": rpc error: code = NotFound desc = could not find container \"96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef\": container with ID starting with 96d3398aa45e14a1610aa742949aabdd7940c244b6689f7f5e0aecbb8e1a59ef not found: ID does not exist" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.046087 4772 scope.go:117] "RemoveContainer" containerID="d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0" Jan 27 16:51:35 crc kubenswrapper[4772]: E0127 16:51:35.046524 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0\": container with ID starting with d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0 not found: ID does not exist" containerID="d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0" Jan 27 16:51:35 crc kubenswrapper[4772]: I0127 16:51:35.046546 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0"} err="failed to get container status \"d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0\": rpc error: code = NotFound desc = could not find container \"d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0\": container with ID starting with d780cae7ced2ecb0446c90d88daae5a5876a08e50de72a90ffd7cbde562106a0 not found: ID does not exist" Jan 27 16:51:36 crc kubenswrapper[4772]: I0127 16:51:36.677078 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" path="/var/lib/kubelet/pods/e094ebf0-a014-4071-974b-f8cf8bf8a1d3/volumes" Jan 27 16:52:09 crc kubenswrapper[4772]: I0127 16:52:09.588158 4772 scope.go:117] "RemoveContainer" containerID="d0051faf5f33fa9d044b3023d9e4654d63902fac62af135831e6d6e9a248c7b6" Jan 27 16:52:12 crc kubenswrapper[4772]: I0127 16:52:12.058872 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:52:12 crc kubenswrapper[4772]: I0127 16:52:12.059190 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:52:42 crc kubenswrapper[4772]: I0127 16:52:42.059305 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:52:42 crc kubenswrapper[4772]: I0127 16:52:42.060127 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.058655 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.059566 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.059633 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.061200 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.061312 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" gracePeriod=600 Jan 27 16:53:12 crc kubenswrapper[4772]: E0127 16:53:12.197368 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.926849 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" exitCode=0 Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.926910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317"} Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.926985 4772 scope.go:117] "RemoveContainer" containerID="fad112d2b60be41180a280804db9237c789adfcc03cd7342f9bc2b818aa8e8b4" Jan 27 16:53:12 crc kubenswrapper[4772]: I0127 16:53:12.928012 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:53:12 crc kubenswrapper[4772]: E0127 16:53:12.928495 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.297863 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:19 crc kubenswrapper[4772]: E0127 16:53:19.298787 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="extract-utilities" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.298802 
4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="extract-utilities" Jan 27 16:53:19 crc kubenswrapper[4772]: E0127 16:53:19.298839 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="registry-server" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.298847 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="registry-server" Jan 27 16:53:19 crc kubenswrapper[4772]: E0127 16:53:19.298866 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="extract-content" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.298874 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="extract-content" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.299078 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e094ebf0-a014-4071-974b-f8cf8bf8a1d3" containerName="registry-server" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.300704 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.336884 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.416198 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bw2\" (UniqueName: \"kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.416244 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.416280 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.517696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.518133 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g5bw2\" (UniqueName: \"kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.518273 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.518179 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.518746 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.543153 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bw2\" (UniqueName: \"kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2\") pod \"certified-operators-59lfv\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:19 crc kubenswrapper[4772]: I0127 16:53:19.672950 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:20 crc kubenswrapper[4772]: W0127 16:53:20.228484 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode447eca7_2085_4dde_84ff_0e85dccea5e9.slice/crio-d822540001ec1fd39e430aef97cccf0e570b9021e0151dc4cfcb80bcdd01b888 WatchSource:0}: Error finding container d822540001ec1fd39e430aef97cccf0e570b9021e0151dc4cfcb80bcdd01b888: Status 404 returned error can't find the container with id d822540001ec1fd39e430aef97cccf0e570b9021e0151dc4cfcb80bcdd01b888 Jan 27 16:53:20 crc kubenswrapper[4772]: I0127 16:53:20.228545 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:21 crc kubenswrapper[4772]: I0127 16:53:21.026063 4772 generic.go:334] "Generic (PLEG): container finished" podID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerID="59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d" exitCode=0 Jan 27 16:53:21 crc kubenswrapper[4772]: I0127 16:53:21.026333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerDied","Data":"59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d"} Jan 27 16:53:21 crc kubenswrapper[4772]: I0127 16:53:21.026480 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerStarted","Data":"d822540001ec1fd39e430aef97cccf0e570b9021e0151dc4cfcb80bcdd01b888"} Jan 27 16:53:22 crc kubenswrapper[4772]: I0127 16:53:22.039699 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" 
event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerStarted","Data":"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf"} Jan 27 16:53:23 crc kubenswrapper[4772]: I0127 16:53:23.057345 4772 generic.go:334] "Generic (PLEG): container finished" podID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerID="4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf" exitCode=0 Jan 27 16:53:23 crc kubenswrapper[4772]: I0127 16:53:23.057586 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerDied","Data":"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf"} Jan 27 16:53:24 crc kubenswrapper[4772]: I0127 16:53:24.066525 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerStarted","Data":"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f"} Jan 27 16:53:24 crc kubenswrapper[4772]: I0127 16:53:24.090422 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59lfv" podStartSLOduration=2.6734818860000003 podStartE2EDuration="5.090402047s" podCreationTimestamp="2026-01-27 16:53:19 +0000 UTC" firstStartedPulling="2026-01-27 16:53:21.031015544 +0000 UTC m=+6387.011624672" lastFinishedPulling="2026-01-27 16:53:23.447935735 +0000 UTC m=+6389.428544833" observedRunningTime="2026-01-27 16:53:24.083733267 +0000 UTC m=+6390.064342385" watchObservedRunningTime="2026-01-27 16:53:24.090402047 +0000 UTC m=+6390.071011155" Jan 27 16:53:27 crc kubenswrapper[4772]: I0127 16:53:27.663125 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:53:27 crc kubenswrapper[4772]: E0127 16:53:27.664344 4772 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:53:29 crc kubenswrapper[4772]: I0127 16:53:29.673435 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:29 crc kubenswrapper[4772]: I0127 16:53:29.673752 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:29 crc kubenswrapper[4772]: I0127 16:53:29.715499 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:30 crc kubenswrapper[4772]: I0127 16:53:30.198362 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:30 crc kubenswrapper[4772]: I0127 16:53:30.264108 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.153223 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59lfv" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="registry-server" containerID="cri-o://e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f" gracePeriod=2 Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.664746 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.723521 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bw2\" (UniqueName: \"kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2\") pod \"e447eca7-2085-4dde-84ff-0e85dccea5e9\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.723608 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content\") pod \"e447eca7-2085-4dde-84ff-0e85dccea5e9\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.723780 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities\") pod \"e447eca7-2085-4dde-84ff-0e85dccea5e9\" (UID: \"e447eca7-2085-4dde-84ff-0e85dccea5e9\") " Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.725676 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities" (OuterVolumeSpecName: "utilities") pod "e447eca7-2085-4dde-84ff-0e85dccea5e9" (UID: "e447eca7-2085-4dde-84ff-0e85dccea5e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.730959 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2" (OuterVolumeSpecName: "kube-api-access-g5bw2") pod "e447eca7-2085-4dde-84ff-0e85dccea5e9" (UID: "e447eca7-2085-4dde-84ff-0e85dccea5e9"). InnerVolumeSpecName "kube-api-access-g5bw2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.774321 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e447eca7-2085-4dde-84ff-0e85dccea5e9" (UID: "e447eca7-2085-4dde-84ff-0e85dccea5e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.826288 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.826317 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bw2\" (UniqueName: \"kubernetes.io/projected/e447eca7-2085-4dde-84ff-0e85dccea5e9-kube-api-access-g5bw2\") on node \"crc\" DevicePath \"\"" Jan 27 16:53:32 crc kubenswrapper[4772]: I0127 16:53:32.826328 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e447eca7-2085-4dde-84ff-0e85dccea5e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.166445 4772 generic.go:334] "Generic (PLEG): container finished" podID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerID="e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f" exitCode=0 Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.166497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerDied","Data":"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f"} Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.166559 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-59lfv" event={"ID":"e447eca7-2085-4dde-84ff-0e85dccea5e9","Type":"ContainerDied","Data":"d822540001ec1fd39e430aef97cccf0e570b9021e0151dc4cfcb80bcdd01b888"} Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.166584 4772 scope.go:117] "RemoveContainer" containerID="e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.166583 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59lfv" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.205202 4772 scope.go:117] "RemoveContainer" containerID="4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.207088 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.218210 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59lfv"] Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.237628 4772 scope.go:117] "RemoveContainer" containerID="59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.294821 4772 scope.go:117] "RemoveContainer" containerID="e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f" Jan 27 16:53:33 crc kubenswrapper[4772]: E0127 16:53:33.295256 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f\": container with ID starting with e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f not found: ID does not exist" containerID="e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 
16:53:33.295297 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f"} err="failed to get container status \"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f\": rpc error: code = NotFound desc = could not find container \"e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f\": container with ID starting with e22a8458896bb5d1120cb8870b4e1750f928e8189a6f596227508efe1f8e459f not found: ID does not exist" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.295325 4772 scope.go:117] "RemoveContainer" containerID="4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf" Jan 27 16:53:33 crc kubenswrapper[4772]: E0127 16:53:33.295658 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf\": container with ID starting with 4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf not found: ID does not exist" containerID="4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.295698 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf"} err="failed to get container status \"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf\": rpc error: code = NotFound desc = could not find container \"4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf\": container with ID starting with 4598336c5b7617228bcf6cd156c66ab183d12ee169a48a9e366dd5f0b274d0bf not found: ID does not exist" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.295726 4772 scope.go:117] "RemoveContainer" containerID="59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d" Jan 27 16:53:33 crc 
kubenswrapper[4772]: E0127 16:53:33.295960 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d\": container with ID starting with 59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d not found: ID does not exist" containerID="59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d" Jan 27 16:53:33 crc kubenswrapper[4772]: I0127 16:53:33.295989 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d"} err="failed to get container status \"59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d\": rpc error: code = NotFound desc = could not find container \"59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d\": container with ID starting with 59cabea9546ba64b883bf93fa4862411c29e790935ca2098d56582afa6542b6d not found: ID does not exist" Jan 27 16:53:34 crc kubenswrapper[4772]: I0127 16:53:34.680380 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" path="/var/lib/kubelet/pods/e447eca7-2085-4dde-84ff-0e85dccea5e9/volumes" Jan 27 16:53:39 crc kubenswrapper[4772]: I0127 16:53:39.663126 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:53:39 crc kubenswrapper[4772]: E0127 16:53:39.664044 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:53:51 crc 
kubenswrapper[4772]: I0127 16:53:51.663255 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:53:51 crc kubenswrapper[4772]: E0127 16:53:51.664017 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:53:54 crc kubenswrapper[4772]: I0127 16:53:54.036221 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-fw49r"] Jan 27 16:53:54 crc kubenswrapper[4772]: I0127 16:53:54.045203 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-fw49r"] Jan 27 16:53:54 crc kubenswrapper[4772]: I0127 16:53:54.676204 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6985c580-efad-46fe-8e20-9f932ce3af7d" path="/var/lib/kubelet/pods/6985c580-efad-46fe-8e20-9f932ce3af7d/volumes" Jan 27 16:53:56 crc kubenswrapper[4772]: I0127 16:53:56.036426 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c028-account-create-update-x9k2v"] Jan 27 16:53:56 crc kubenswrapper[4772]: I0127 16:53:56.048955 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c028-account-create-update-x9k2v"] Jan 27 16:53:56 crc kubenswrapper[4772]: I0127 16:53:56.681456 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd343bc-4a97-4ad2-aa82-12ea527398d8" path="/var/lib/kubelet/pods/1cd343bc-4a97-4ad2-aa82-12ea527398d8/volumes" Jan 27 16:54:02 crc kubenswrapper[4772]: I0127 16:54:02.080153 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-4mfnz"] Jan 27 
16:54:02 crc kubenswrapper[4772]: I0127 16:54:02.090649 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-4mfnz"] Jan 27 16:54:02 crc kubenswrapper[4772]: I0127 16:54:02.664437 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:54:02 crc kubenswrapper[4772]: E0127 16:54:02.664947 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:54:02 crc kubenswrapper[4772]: I0127 16:54:02.674745 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c203bc37-6216-4aa6-8a51-1e0a2f01bb43" path="/var/lib/kubelet/pods/c203bc37-6216-4aa6-8a51-1e0a2f01bb43/volumes" Jan 27 16:54:03 crc kubenswrapper[4772]: I0127 16:54:03.054428 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-c1c0-account-create-update-cjqxk"] Jan 27 16:54:03 crc kubenswrapper[4772]: I0127 16:54:03.064964 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-c1c0-account-create-update-cjqxk"] Jan 27 16:54:04 crc kubenswrapper[4772]: I0127 16:54:04.681921 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4" path="/var/lib/kubelet/pods/7ff09ddf-04fd-42e9-b6fd-8c9fd9fac0e4/volumes" Jan 27 16:54:09 crc kubenswrapper[4772]: I0127 16:54:09.696109 4772 scope.go:117] "RemoveContainer" containerID="5f81345bb4c2b5ce3e497a908767c4ee00a2cf35d4b6e4a9ea4e1fe6b2891391" Jan 27 16:54:09 crc kubenswrapper[4772]: I0127 16:54:09.728600 4772 scope.go:117] "RemoveContainer" 
containerID="5ed8b1fb179ec96a663871b1030549de3060884fefe55bd4b363bd01934e9e74" Jan 27 16:54:09 crc kubenswrapper[4772]: I0127 16:54:09.776071 4772 scope.go:117] "RemoveContainer" containerID="ce5afc895546186c87fa545a3ff11c6c821cf7ba305b10a05525fc458e3f4be7" Jan 27 16:54:09 crc kubenswrapper[4772]: I0127 16:54:09.814185 4772 scope.go:117] "RemoveContainer" containerID="742607c6bfa7c1d2a5f8ceee5a81852f3b14a71713e837fcc9210fdc6dccc556" Jan 27 16:54:15 crc kubenswrapper[4772]: I0127 16:54:15.664562 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:54:15 crc kubenswrapper[4772]: E0127 16:54:15.665773 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:54:26 crc kubenswrapper[4772]: I0127 16:54:26.663510 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:54:26 crc kubenswrapper[4772]: E0127 16:54:26.664950 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:54:39 crc kubenswrapper[4772]: I0127 16:54:39.664257 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:54:39 crc 
kubenswrapper[4772]: E0127 16:54:39.665278 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:54:40 crc kubenswrapper[4772]: I0127 16:54:40.039427 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-bz6q7"] Jan 27 16:54:40 crc kubenswrapper[4772]: I0127 16:54:40.047146 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-bz6q7"] Jan 27 16:54:40 crc kubenswrapper[4772]: I0127 16:54:40.680970 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa" path="/var/lib/kubelet/pods/1152dfc9-a3d1-41a5-92cb-a5a8c481a8fa/volumes" Jan 27 16:54:53 crc kubenswrapper[4772]: I0127 16:54:53.662842 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:54:53 crc kubenswrapper[4772]: E0127 16:54:53.664961 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:55:08 crc kubenswrapper[4772]: I0127 16:55:08.664728 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:55:08 crc kubenswrapper[4772]: E0127 16:55:08.665802 4772 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:55:09 crc kubenswrapper[4772]: I0127 16:55:09.957300 4772 scope.go:117] "RemoveContainer" containerID="908bfdb73b8d3ebb90afef686e407a3a2da1f2ea295c24e132068119ed919b42" Jan 27 16:55:09 crc kubenswrapper[4772]: I0127 16:55:09.995510 4772 scope.go:117] "RemoveContainer" containerID="3c5cbb7ef3f7daca21bab77efa8022a5214525eb8b99dd2483d6689f1db4cb83" Jan 27 16:55:20 crc kubenswrapper[4772]: I0127 16:55:20.664024 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:55:20 crc kubenswrapper[4772]: E0127 16:55:20.665223 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:55:35 crc kubenswrapper[4772]: I0127 16:55:35.663395 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:55:35 crc kubenswrapper[4772]: E0127 16:55:35.664361 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:55:48 crc kubenswrapper[4772]: I0127 16:55:48.663700 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:55:48 crc kubenswrapper[4772]: E0127 16:55:48.664619 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:56:01 crc kubenswrapper[4772]: I0127 16:56:01.663778 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:56:01 crc kubenswrapper[4772]: E0127 16:56:01.664736 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:56:16 crc kubenswrapper[4772]: I0127 16:56:16.663927 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:56:16 crc kubenswrapper[4772]: E0127 16:56:16.665275 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:56:27 crc kubenswrapper[4772]: I0127 16:56:27.664348 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:56:27 crc kubenswrapper[4772]: E0127 16:56:27.665354 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:56:41 crc kubenswrapper[4772]: I0127 16:56:41.663906 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:56:41 crc kubenswrapper[4772]: E0127 16:56:41.664726 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:56:54 crc kubenswrapper[4772]: I0127 16:56:54.662967 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:56:54 crc kubenswrapper[4772]: E0127 16:56:54.663750 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:57:08 crc kubenswrapper[4772]: I0127 16:57:08.663500 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:57:08 crc kubenswrapper[4772]: E0127 16:57:08.664656 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:57:22 crc kubenswrapper[4772]: I0127 16:57:22.664985 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:57:22 crc kubenswrapper[4772]: E0127 16:57:22.666570 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:57:34 crc kubenswrapper[4772]: I0127 16:57:34.669880 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:57:34 crc kubenswrapper[4772]: E0127 16:57:34.670801 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:57:49 crc kubenswrapper[4772]: I0127 16:57:49.663304 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:57:49 crc kubenswrapper[4772]: E0127 16:57:49.664627 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:58:02 crc kubenswrapper[4772]: I0127 16:58:02.663102 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:58:02 crc kubenswrapper[4772]: E0127 16:58:02.664099 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 16:58:15 crc kubenswrapper[4772]: I0127 16:58:15.663855 4772 scope.go:117] "RemoveContainer" containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.441305 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b"} Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.637464 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:16 crc kubenswrapper[4772]: E0127 16:58:16.637913 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="extract-utilities" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.637934 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="extract-utilities" Jan 27 16:58:16 crc kubenswrapper[4772]: E0127 16:58:16.637957 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="registry-server" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.637965 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="registry-server" Jan 27 16:58:16 crc kubenswrapper[4772]: E0127 16:58:16.637985 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="extract-content" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.637992 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="extract-content" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.638279 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e447eca7-2085-4dde-84ff-0e85dccea5e9" containerName="registry-server" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.640099 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.722923 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.783565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwlb6\" (UniqueName: \"kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.783726 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.783758 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.885066 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwlb6\" (UniqueName: \"kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.885186 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.885208 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.885667 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.885767 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:16 crc kubenswrapper[4772]: I0127 16:58:16.904752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwlb6\" (UniqueName: \"kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6\") pod \"redhat-operators-89vmp\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:17 crc kubenswrapper[4772]: I0127 16:58:17.022506 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:17 crc kubenswrapper[4772]: I0127 16:58:17.527593 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:17 crc kubenswrapper[4772]: W0127 16:58:17.532374 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91cc5e4_df6f_47e0_b9ce_795e207a8561.slice/crio-909dee473f295614475f73b896a6fcc55c3055397f22ce88d988578e94255531 WatchSource:0}: Error finding container 909dee473f295614475f73b896a6fcc55c3055397f22ce88d988578e94255531: Status 404 returned error can't find the container with id 909dee473f295614475f73b896a6fcc55c3055397f22ce88d988578e94255531 Jan 27 16:58:18 crc kubenswrapper[4772]: I0127 16:58:18.458588 4772 generic.go:334] "Generic (PLEG): container finished" podID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerID="8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0" exitCode=0 Jan 27 16:58:18 crc kubenswrapper[4772]: I0127 16:58:18.458705 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerDied","Data":"8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0"} Jan 27 16:58:18 crc kubenswrapper[4772]: I0127 16:58:18.459103 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerStarted","Data":"909dee473f295614475f73b896a6fcc55c3055397f22ce88d988578e94255531"} Jan 27 16:58:18 crc kubenswrapper[4772]: I0127 16:58:18.461576 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 16:58:19 crc kubenswrapper[4772]: I0127 16:58:19.475996 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerStarted","Data":"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b"} Jan 27 16:58:20 crc kubenswrapper[4772]: I0127 16:58:20.492210 4772 generic.go:334] "Generic (PLEG): container finished" podID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerID="8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b" exitCode=0 Jan 27 16:58:20 crc kubenswrapper[4772]: I0127 16:58:20.492859 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerDied","Data":"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b"} Jan 27 16:58:21 crc kubenswrapper[4772]: I0127 16:58:21.505554 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerStarted","Data":"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816"} Jan 27 16:58:21 crc kubenswrapper[4772]: I0127 16:58:21.534316 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89vmp" podStartSLOduration=3.090597674 podStartE2EDuration="5.534295549s" podCreationTimestamp="2026-01-27 16:58:16 +0000 UTC" firstStartedPulling="2026-01-27 16:58:18.461240808 +0000 UTC m=+6684.441849926" lastFinishedPulling="2026-01-27 16:58:20.904938703 +0000 UTC m=+6686.885547801" observedRunningTime="2026-01-27 16:58:21.525277482 +0000 UTC m=+6687.505886610" watchObservedRunningTime="2026-01-27 16:58:21.534295549 +0000 UTC m=+6687.514904657" Jan 27 16:58:27 crc kubenswrapper[4772]: I0127 16:58:27.023261 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:27 crc kubenswrapper[4772]: I0127 16:58:27.024888 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:28 crc kubenswrapper[4772]: I0127 16:58:28.080995 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-89vmp" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="registry-server" probeResult="failure" output=< Jan 27 16:58:28 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 16:58:28 crc kubenswrapper[4772]: > Jan 27 16:58:37 crc kubenswrapper[4772]: I0127 16:58:37.101686 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:37 crc kubenswrapper[4772]: I0127 16:58:37.182106 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:37 crc kubenswrapper[4772]: I0127 16:58:37.354871 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:38 crc kubenswrapper[4772]: I0127 16:58:38.686976 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89vmp" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="registry-server" containerID="cri-o://340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816" gracePeriod=2 Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.170795 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.296803 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities\") pod \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.296888 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwlb6\" (UniqueName: \"kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6\") pod \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.296950 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content\") pod \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\" (UID: \"f91cc5e4-df6f-47e0-b9ce-795e207a8561\") " Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.298649 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities" (OuterVolumeSpecName: "utilities") pod "f91cc5e4-df6f-47e0-b9ce-795e207a8561" (UID: "f91cc5e4-df6f-47e0-b9ce-795e207a8561"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.303237 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6" (OuterVolumeSpecName: "kube-api-access-fwlb6") pod "f91cc5e4-df6f-47e0-b9ce-795e207a8561" (UID: "f91cc5e4-df6f-47e0-b9ce-795e207a8561"). InnerVolumeSpecName "kube-api-access-fwlb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.400084 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.400149 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwlb6\" (UniqueName: \"kubernetes.io/projected/f91cc5e4-df6f-47e0-b9ce-795e207a8561-kube-api-access-fwlb6\") on node \"crc\" DevicePath \"\"" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.425256 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f91cc5e4-df6f-47e0-b9ce-795e207a8561" (UID: "f91cc5e4-df6f-47e0-b9ce-795e207a8561"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.502505 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f91cc5e4-df6f-47e0-b9ce-795e207a8561-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.723158 4772 generic.go:334] "Generic (PLEG): container finished" podID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerID="340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816" exitCode=0 Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.723246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerDied","Data":"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816"} Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.723295 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-89vmp" event={"ID":"f91cc5e4-df6f-47e0-b9ce-795e207a8561","Type":"ContainerDied","Data":"909dee473f295614475f73b896a6fcc55c3055397f22ce88d988578e94255531"} Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.723310 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89vmp" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.723325 4772 scope.go:117] "RemoveContainer" containerID="340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.772642 4772 scope.go:117] "RemoveContainer" containerID="8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.777560 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.789753 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89vmp"] Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.797779 4772 scope.go:117] "RemoveContainer" containerID="8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.844446 4772 scope.go:117] "RemoveContainer" containerID="340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816" Jan 27 16:58:39 crc kubenswrapper[4772]: E0127 16:58:39.844860 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816\": container with ID starting with 340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816 not found: ID does not exist" containerID="340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.844906 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816"} err="failed to get container status \"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816\": rpc error: code = NotFound desc = could not find container \"340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816\": container with ID starting with 340c878f6ca56d802bba3e1f2d501805058c80c44e6d3a585e2176c49b0e4816 not found: ID does not exist" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.844931 4772 scope.go:117] "RemoveContainer" containerID="8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b" Jan 27 16:58:39 crc kubenswrapper[4772]: E0127 16:58:39.845197 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b\": container with ID starting with 8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b not found: ID does not exist" containerID="8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.845218 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b"} err="failed to get container status \"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b\": rpc error: code = NotFound desc = could not find container \"8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b\": container with ID starting with 8ce2ec520d6e47d51f04b217aa2ebed5a969d9d59a7c977fcf91c69467c21d2b not found: ID does not exist" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.845231 4772 scope.go:117] "RemoveContainer" containerID="8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0" Jan 27 16:58:39 crc kubenswrapper[4772]: E0127 
16:58:39.845430 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0\": container with ID starting with 8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0 not found: ID does not exist" containerID="8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0" Jan 27 16:58:39 crc kubenswrapper[4772]: I0127 16:58:39.845455 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0"} err="failed to get container status \"8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0\": rpc error: code = NotFound desc = could not find container \"8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0\": container with ID starting with 8b5be62588d532a863b7d2cb4a30bdd9901e4a378dd25611d18afdd47f45e4a0 not found: ID does not exist" Jan 27 16:58:40 crc kubenswrapper[4772]: I0127 16:58:40.683517 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" path="/var/lib/kubelet/pods/f91cc5e4-df6f-47e0-b9ce-795e207a8561/volumes" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.171148 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps"] Jan 27 17:00:00 crc kubenswrapper[4772]: E0127 17:00:00.172270 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="extract-content" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.172288 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="extract-content" Jan 27 17:00:00 crc kubenswrapper[4772]: E0127 17:00:00.172322 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="extract-utilities" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.172331 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="extract-utilities" Jan 27 17:00:00 crc kubenswrapper[4772]: E0127 17:00:00.172355 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="registry-server" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.172364 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="registry-server" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.172593 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f91cc5e4-df6f-47e0-b9ce-795e207a8561" containerName="registry-server" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.173409 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.176453 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.181387 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.198308 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps"] Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.302520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume\") pod \"collect-profiles-29492220-sqfps\" 
(UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.302653 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mc69\" (UniqueName: \"kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.302682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.404746 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.404845 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mc69\" (UniqueName: \"kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.404864 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.405782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.412946 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.425867 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mc69\" (UniqueName: \"kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69\") pod \"collect-profiles-29492220-sqfps\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.500637 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:00 crc kubenswrapper[4772]: I0127 17:00:00.975926 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps"] Jan 27 17:00:01 crc kubenswrapper[4772]: I0127 17:00:01.556920 4772 generic.go:334] "Generic (PLEG): container finished" podID="73038a7f-6c26-47b7-ad06-bd235e268224" containerID="21fbf772d614ea3a35cb7d6244635ba0574a7b4a610726fba6575e200d3d3209" exitCode=0 Jan 27 17:00:01 crc kubenswrapper[4772]: I0127 17:00:01.556967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" event={"ID":"73038a7f-6c26-47b7-ad06-bd235e268224","Type":"ContainerDied","Data":"21fbf772d614ea3a35cb7d6244635ba0574a7b4a610726fba6575e200d3d3209"} Jan 27 17:00:01 crc kubenswrapper[4772]: I0127 17:00:01.557270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" event={"ID":"73038a7f-6c26-47b7-ad06-bd235e268224","Type":"ContainerStarted","Data":"c6cd8a12b017031e00e5f372168cdf699372dbce305414437676d1d6580430fc"} Jan 27 17:00:02 crc kubenswrapper[4772]: I0127 17:00:02.934013 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.055568 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mc69\" (UniqueName: \"kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69\") pod \"73038a7f-6c26-47b7-ad06-bd235e268224\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.055684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume\") pod \"73038a7f-6c26-47b7-ad06-bd235e268224\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.055824 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume\") pod \"73038a7f-6c26-47b7-ad06-bd235e268224\" (UID: \"73038a7f-6c26-47b7-ad06-bd235e268224\") " Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.056369 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume" (OuterVolumeSpecName: "config-volume") pod "73038a7f-6c26-47b7-ad06-bd235e268224" (UID: "73038a7f-6c26-47b7-ad06-bd235e268224"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.061359 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73038a7f-6c26-47b7-ad06-bd235e268224" (UID: "73038a7f-6c26-47b7-ad06-bd235e268224"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.061388 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69" (OuterVolumeSpecName: "kube-api-access-6mc69") pod "73038a7f-6c26-47b7-ad06-bd235e268224" (UID: "73038a7f-6c26-47b7-ad06-bd235e268224"). InnerVolumeSpecName "kube-api-access-6mc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.159156 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73038a7f-6c26-47b7-ad06-bd235e268224-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.159248 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mc69\" (UniqueName: \"kubernetes.io/projected/73038a7f-6c26-47b7-ad06-bd235e268224-kube-api-access-6mc69\") on node \"crc\" DevicePath \"\"" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.159268 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73038a7f-6c26-47b7-ad06-bd235e268224-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.578212 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" event={"ID":"73038a7f-6c26-47b7-ad06-bd235e268224","Type":"ContainerDied","Data":"c6cd8a12b017031e00e5f372168cdf699372dbce305414437676d1d6580430fc"} Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.578263 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6cd8a12b017031e00e5f372168cdf699372dbce305414437676d1d6580430fc" Jan 27 17:00:03 crc kubenswrapper[4772]: I0127 17:00:03.578292 4772 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps" Jan 27 17:00:04 crc kubenswrapper[4772]: I0127 17:00:04.034655 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4"] Jan 27 17:00:04 crc kubenswrapper[4772]: I0127 17:00:04.046691 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492175-fg6x4"] Jan 27 17:00:04 crc kubenswrapper[4772]: I0127 17:00:04.683433 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be908ee-6173-4ee8-80c4-0738697898d2" path="/var/lib/kubelet/pods/0be908ee-6173-4ee8-80c4-0738697898d2/volumes" Jan 27 17:00:10 crc kubenswrapper[4772]: I0127 17:00:10.209706 4772 scope.go:117] "RemoveContainer" containerID="d013dea461e279e8b861558e82f04a509da66ccae91eabf32103d04803eb33bd" Jan 27 17:00:42 crc kubenswrapper[4772]: I0127 17:00:42.058960 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:00:42 crc kubenswrapper[4772]: I0127 17:00:42.059964 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.070849 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:00:50 crc kubenswrapper[4772]: E0127 17:00:50.072073 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="73038a7f-6c26-47b7-ad06-bd235e268224" containerName="collect-profiles" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.072094 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="73038a7f-6c26-47b7-ad06-bd235e268224" containerName="collect-profiles" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.072367 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="73038a7f-6c26-47b7-ad06-bd235e268224" containerName="collect-profiles" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.074262 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.084095 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.116337 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhfw\" (UniqueName: \"kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.116378 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.116403 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content\") pod \"community-operators-txfct\" (UID: 
\"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.217658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhfw\" (UniqueName: \"kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.217720 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.217749 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.218210 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.218255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") 
" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.240087 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhfw\" (UniqueName: \"kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw\") pod \"community-operators-txfct\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.437309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:00:50 crc kubenswrapper[4772]: I0127 17:00:50.948546 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:00:50 crc kubenswrapper[4772]: W0127 17:00:50.951654 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10eef819_4355_4b65_bb81_95c055327034.slice/crio-23557e0504a3fe336145be5a361937c216f6e8cad04bfe40c3131be90c233123 WatchSource:0}: Error finding container 23557e0504a3fe336145be5a361937c216f6e8cad04bfe40c3131be90c233123: Status 404 returned error can't find the container with id 23557e0504a3fe336145be5a361937c216f6e8cad04bfe40c3131be90c233123 Jan 27 17:00:51 crc kubenswrapper[4772]: I0127 17:00:51.128123 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerStarted","Data":"23557e0504a3fe336145be5a361937c216f6e8cad04bfe40c3131be90c233123"} Jan 27 17:00:52 crc kubenswrapper[4772]: I0127 17:00:52.136759 4772 generic.go:334] "Generic (PLEG): container finished" podID="10eef819-4355-4b65-bb81-95c055327034" containerID="7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c" exitCode=0 Jan 27 17:00:52 crc kubenswrapper[4772]: I0127 
17:00:52.136858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerDied","Data":"7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c"} Jan 27 17:00:56 crc kubenswrapper[4772]: I0127 17:00:56.177069 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerStarted","Data":"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605"} Jan 27 17:00:57 crc kubenswrapper[4772]: I0127 17:00:57.188048 4772 generic.go:334] "Generic (PLEG): container finished" podID="10eef819-4355-4b65-bb81-95c055327034" containerID="490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605" exitCode=0 Jan 27 17:00:57 crc kubenswrapper[4772]: I0127 17:00:57.188099 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerDied","Data":"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605"} Jan 27 17:00:58 crc kubenswrapper[4772]: I0127 17:00:58.200132 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerStarted","Data":"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93"} Jan 27 17:00:58 crc kubenswrapper[4772]: I0127 17:00:58.226149 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-txfct" podStartSLOduration=2.680170077 podStartE2EDuration="8.226126016s" podCreationTimestamp="2026-01-27 17:00:50 +0000 UTC" firstStartedPulling="2026-01-27 17:00:52.1384179 +0000 UTC m=+6838.119027018" lastFinishedPulling="2026-01-27 17:00:57.684373849 +0000 UTC m=+6843.664982957" observedRunningTime="2026-01-27 
17:00:58.221127443 +0000 UTC m=+6844.201736551" watchObservedRunningTime="2026-01-27 17:00:58.226126016 +0000 UTC m=+6844.206735104" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.169643 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29492221-sbxjp"] Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.172419 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.194787 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29492221-sbxjp"] Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.235974 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnldn\" (UniqueName: \"kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.236051 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.236098 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.236154 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.337997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.338482 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.338704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnldn\" (UniqueName: \"kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.338879 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.344587 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.344781 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.346110 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.355074 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnldn\" (UniqueName: \"kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn\") pod \"keystone-cron-29492221-sbxjp\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.438057 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.438124 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.494504 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:00 crc kubenswrapper[4772]: I0127 17:01:00.501932 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:01:01 crc kubenswrapper[4772]: I0127 17:01:01.271342 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29492221-sbxjp"] Jan 27 17:01:01 crc kubenswrapper[4772]: W0127 17:01:01.286650 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8459d055_62d3_4699_b477_ea15946b982c.slice/crio-9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20 WatchSource:0}: Error finding container 9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20: Status 404 returned error can't find the container with id 9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20 Jan 27 17:01:02 crc kubenswrapper[4772]: I0127 17:01:02.244097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29492221-sbxjp" event={"ID":"8459d055-62d3-4699-b477-ea15946b982c","Type":"ContainerStarted","Data":"9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20"} Jan 27 17:01:05 crc kubenswrapper[4772]: I0127 17:01:05.274449 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29492221-sbxjp" event={"ID":"8459d055-62d3-4699-b477-ea15946b982c","Type":"ContainerStarted","Data":"4cd94d61ef219342980e5481c59103d0260264e8f26556ff01423fdd9d5863ee"} Jan 27 17:01:07 crc kubenswrapper[4772]: I0127 17:01:07.302238 4772 generic.go:334] "Generic (PLEG): container finished" podID="8459d055-62d3-4699-b477-ea15946b982c" containerID="4cd94d61ef219342980e5481c59103d0260264e8f26556ff01423fdd9d5863ee" exitCode=0 Jan 27 17:01:07 crc kubenswrapper[4772]: I0127 17:01:07.302344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-cron-29492221-sbxjp" event={"ID":"8459d055-62d3-4699-b477-ea15946b982c","Type":"ContainerDied","Data":"4cd94d61ef219342980e5481c59103d0260264e8f26556ff01423fdd9d5863ee"} Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.655383 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.750777 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys\") pod \"8459d055-62d3-4699-b477-ea15946b982c\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.750872 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data\") pod \"8459d055-62d3-4699-b477-ea15946b982c\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.751043 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnldn\" (UniqueName: \"kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn\") pod \"8459d055-62d3-4699-b477-ea15946b982c\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.751147 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle\") pod \"8459d055-62d3-4699-b477-ea15946b982c\" (UID: \"8459d055-62d3-4699-b477-ea15946b982c\") " Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.757490 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8459d055-62d3-4699-b477-ea15946b982c" (UID: "8459d055-62d3-4699-b477-ea15946b982c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.758697 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn" (OuterVolumeSpecName: "kube-api-access-qnldn") pod "8459d055-62d3-4699-b477-ea15946b982c" (UID: "8459d055-62d3-4699-b477-ea15946b982c"). InnerVolumeSpecName "kube-api-access-qnldn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.779499 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8459d055-62d3-4699-b477-ea15946b982c" (UID: "8459d055-62d3-4699-b477-ea15946b982c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.804920 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data" (OuterVolumeSpecName: "config-data") pod "8459d055-62d3-4699-b477-ea15946b982c" (UID: "8459d055-62d3-4699-b477-ea15946b982c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.854109 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnldn\" (UniqueName: \"kubernetes.io/projected/8459d055-62d3-4699-b477-ea15946b982c-kube-api-access-qnldn\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.854436 4772 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.854451 4772 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:08 crc kubenswrapper[4772]: I0127 17:01:08.854464 4772 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8459d055-62d3-4699-b477-ea15946b982c-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:09 crc kubenswrapper[4772]: I0127 17:01:09.322418 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29492221-sbxjp" event={"ID":"8459d055-62d3-4699-b477-ea15946b982c","Type":"ContainerDied","Data":"9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20"} Jan 27 17:01:09 crc kubenswrapper[4772]: I0127 17:01:09.322481 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e569490788ee5825687fe890769c82986f397a62be24308cfba122d04118a20" Jan 27 17:01:09 crc kubenswrapper[4772]: I0127 17:01:09.322482 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29492221-sbxjp" Jan 27 17:01:10 crc kubenswrapper[4772]: I0127 17:01:10.554725 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:01:10 crc kubenswrapper[4772]: I0127 17:01:10.619293 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:01:10 crc kubenswrapper[4772]: I0127 17:01:10.661685 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 17:01:10 crc kubenswrapper[4772]: I0127 17:01:10.661903 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4vbvz" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="registry-server" containerID="cri-o://facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f" gracePeriod=2 Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.152290 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.304301 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities\") pod \"20e4371a-8bd2-4405-bb18-861923bfd37e\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.304442 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content\") pod \"20e4371a-8bd2-4405-bb18-861923bfd37e\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.304656 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csq5c\" (UniqueName: \"kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c\") pod \"20e4371a-8bd2-4405-bb18-861923bfd37e\" (UID: \"20e4371a-8bd2-4405-bb18-861923bfd37e\") " Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.306555 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities" (OuterVolumeSpecName: "utilities") pod "20e4371a-8bd2-4405-bb18-861923bfd37e" (UID: "20e4371a-8bd2-4405-bb18-861923bfd37e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.328370 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c" (OuterVolumeSpecName: "kube-api-access-csq5c") pod "20e4371a-8bd2-4405-bb18-861923bfd37e" (UID: "20e4371a-8bd2-4405-bb18-861923bfd37e"). InnerVolumeSpecName "kube-api-access-csq5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.345268 4772 generic.go:334] "Generic (PLEG): container finished" podID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerID="facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f" exitCode=0 Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.345357 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4vbvz" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.345433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerDied","Data":"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f"} Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.345550 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4vbvz" event={"ID":"20e4371a-8bd2-4405-bb18-861923bfd37e","Type":"ContainerDied","Data":"f541cf85cdae422e49f9b3df32c8dd0416ecc007711177dc8c1d7d7680929e43"} Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.346141 4772 scope.go:117] "RemoveContainer" containerID="facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.407766 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.407808 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csq5c\" (UniqueName: \"kubernetes.io/projected/20e4371a-8bd2-4405-bb18-861923bfd37e-kube-api-access-csq5c\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.416054 4772 scope.go:117] "RemoveContainer" 
containerID="3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.421621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20e4371a-8bd2-4405-bb18-861923bfd37e" (UID: "20e4371a-8bd2-4405-bb18-861923bfd37e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.469690 4772 scope.go:117] "RemoveContainer" containerID="a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.503252 4772 scope.go:117] "RemoveContainer" containerID="facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f" Jan 27 17:01:11 crc kubenswrapper[4772]: E0127 17:01:11.503777 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f\": container with ID starting with facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f not found: ID does not exist" containerID="facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.503820 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f"} err="failed to get container status \"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f\": rpc error: code = NotFound desc = could not find container \"facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f\": container with ID starting with facc4381b6ccbc800e2ee9d053a4ca1a05dc9ad0f250dcd95e8e1ea0e652318f not found: ID does not exist" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 
17:01:11.503844 4772 scope.go:117] "RemoveContainer" containerID="3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c" Jan 27 17:01:11 crc kubenswrapper[4772]: E0127 17:01:11.504139 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c\": container with ID starting with 3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c not found: ID does not exist" containerID="3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.504176 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c"} err="failed to get container status \"3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c\": rpc error: code = NotFound desc = could not find container \"3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c\": container with ID starting with 3e6786fd605112a5af975912e2a7080680ebac9d9bf41ad3486e573e6d6d2f4c not found: ID does not exist" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.504190 4772 scope.go:117] "RemoveContainer" containerID="a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e" Jan 27 17:01:11 crc kubenswrapper[4772]: E0127 17:01:11.504454 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e\": container with ID starting with a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e not found: ID does not exist" containerID="a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.504526 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e"} err="failed to get container status \"a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e\": rpc error: code = NotFound desc = could not find container \"a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e\": container with ID starting with a913cea196ab4b53872fe2376a7200e2beda9ee644b2571a89e275314690904e not found: ID does not exist" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.509780 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e4371a-8bd2-4405-bb18-861923bfd37e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.678480 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 17:01:11 crc kubenswrapper[4772]: I0127 17:01:11.688879 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4vbvz"] Jan 27 17:01:12 crc kubenswrapper[4772]: I0127 17:01:12.058272 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:01:12 crc kubenswrapper[4772]: I0127 17:01:12.058341 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:01:12 crc kubenswrapper[4772]: I0127 17:01:12.687441 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" path="/var/lib/kubelet/pods/20e4371a-8bd2-4405-bb18-861923bfd37e/volumes" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.747131 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:32 crc kubenswrapper[4772]: E0127 17:01:32.751012 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="extract-utilities" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.751204 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="extract-utilities" Jan 27 17:01:32 crc kubenswrapper[4772]: E0127 17:01:32.751331 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="registry-server" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.751419 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="registry-server" Jan 27 17:01:32 crc kubenswrapper[4772]: E0127 17:01:32.751529 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="extract-content" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.751704 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="extract-content" Jan 27 17:01:32 crc kubenswrapper[4772]: E0127 17:01:32.751881 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8459d055-62d3-4699-b477-ea15946b982c" containerName="keystone-cron" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.751989 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8459d055-62d3-4699-b477-ea15946b982c" containerName="keystone-cron" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.752439 4772 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="20e4371a-8bd2-4405-bb18-861923bfd37e" containerName="registry-server" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.752788 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8459d055-62d3-4699-b477-ea15946b982c" containerName="keystone-cron" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.754941 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.778327 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.905451 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.905818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:32 crc kubenswrapper[4772]: I0127 17:01:32.905881 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgn7d\" (UniqueName: \"kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.007262 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.007447 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.007474 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgn7d\" (UniqueName: \"kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.007807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.007900 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.027181 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mgn7d\" (UniqueName: \"kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d\") pod \"redhat-marketplace-wt2ph\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.089876 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:33 crc kubenswrapper[4772]: I0127 17:01:33.605290 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:34 crc kubenswrapper[4772]: I0127 17:01:34.604108 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerID="2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4" exitCode=0 Jan 27 17:01:34 crc kubenswrapper[4772]: I0127 17:01:34.604157 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerDied","Data":"2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4"} Jan 27 17:01:34 crc kubenswrapper[4772]: I0127 17:01:34.604486 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerStarted","Data":"acdeb111e40eb59f1abc80768c71ed2578e0a0139b01996558c0b753b7b98050"} Jan 27 17:01:36 crc kubenswrapper[4772]: I0127 17:01:36.630401 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerID="c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b" exitCode=0 Jan 27 17:01:36 crc kubenswrapper[4772]: I0127 17:01:36.630547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" 
event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerDied","Data":"c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b"} Jan 27 17:01:37 crc kubenswrapper[4772]: I0127 17:01:37.645628 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerStarted","Data":"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139"} Jan 27 17:01:37 crc kubenswrapper[4772]: I0127 17:01:37.677979 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wt2ph" podStartSLOduration=2.981748966 podStartE2EDuration="5.67794794s" podCreationTimestamp="2026-01-27 17:01:32 +0000 UTC" firstStartedPulling="2026-01-27 17:01:34.607997285 +0000 UTC m=+6880.588606383" lastFinishedPulling="2026-01-27 17:01:37.304196219 +0000 UTC m=+6883.284805357" observedRunningTime="2026-01-27 17:01:37.671849105 +0000 UTC m=+6883.652458243" watchObservedRunningTime="2026-01-27 17:01:37.67794794 +0000 UTC m=+6883.658557078" Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.058800 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.059478 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.059546 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.060419 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.060498 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b" gracePeriod=600 Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.705139 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b" exitCode=0 Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.705237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b"} Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.705467 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7"} Jan 27 17:01:42 crc kubenswrapper[4772]: I0127 17:01:42.705490 4772 scope.go:117] "RemoveContainer" 
containerID="20d6f541f5f3fc25f3b782ee3a329f52bc226dbfbe21a2a0bc1c99608d7d2317" Jan 27 17:01:43 crc kubenswrapper[4772]: I0127 17:01:43.090342 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:43 crc kubenswrapper[4772]: I0127 17:01:43.090408 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:43 crc kubenswrapper[4772]: I0127 17:01:43.150535 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:43 crc kubenswrapper[4772]: I0127 17:01:43.796958 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:43 crc kubenswrapper[4772]: I0127 17:01:43.848863 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:45 crc kubenswrapper[4772]: I0127 17:01:45.761752 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wt2ph" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="registry-server" containerID="cri-o://e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139" gracePeriod=2 Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.275128 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.455668 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgn7d\" (UniqueName: \"kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d\") pod \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.456227 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content\") pod \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.456511 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities\") pod \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\" (UID: \"5b18dbed-da5b-4cc5-adff-4ea91a19d097\") " Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.457891 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities" (OuterVolumeSpecName: "utilities") pod "5b18dbed-da5b-4cc5-adff-4ea91a19d097" (UID: "5b18dbed-da5b-4cc5-adff-4ea91a19d097"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.464924 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d" (OuterVolumeSpecName: "kube-api-access-mgn7d") pod "5b18dbed-da5b-4cc5-adff-4ea91a19d097" (UID: "5b18dbed-da5b-4cc5-adff-4ea91a19d097"). InnerVolumeSpecName "kube-api-access-mgn7d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.487888 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b18dbed-da5b-4cc5-adff-4ea91a19d097" (UID: "5b18dbed-da5b-4cc5-adff-4ea91a19d097"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.560479 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.560537 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgn7d\" (UniqueName: \"kubernetes.io/projected/5b18dbed-da5b-4cc5-adff-4ea91a19d097-kube-api-access-mgn7d\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.560559 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b18dbed-da5b-4cc5-adff-4ea91a19d097-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.778591 4772 generic.go:334] "Generic (PLEG): container finished" podID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerID="e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139" exitCode=0 Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.778662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerDied","Data":"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139"} Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.778707 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-wt2ph" event={"ID":"5b18dbed-da5b-4cc5-adff-4ea91a19d097","Type":"ContainerDied","Data":"acdeb111e40eb59f1abc80768c71ed2578e0a0139b01996558c0b753b7b98050"} Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.778744 4772 scope.go:117] "RemoveContainer" containerID="e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.778979 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wt2ph" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.822607 4772 scope.go:117] "RemoveContainer" containerID="c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.832962 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.842418 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wt2ph"] Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.856310 4772 scope.go:117] "RemoveContainer" containerID="2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.894303 4772 scope.go:117] "RemoveContainer" containerID="e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139" Jan 27 17:01:46 crc kubenswrapper[4772]: E0127 17:01:46.894782 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139\": container with ID starting with e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139 not found: ID does not exist" containerID="e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.894903 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139"} err="failed to get container status \"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139\": rpc error: code = NotFound desc = could not find container \"e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139\": container with ID starting with e79dd4d96df89caf0c641fb436c853eeaebb87cdbc1ffe88dfe6a8b470494139 not found: ID does not exist" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.894938 4772 scope.go:117] "RemoveContainer" containerID="c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b" Jan 27 17:01:46 crc kubenswrapper[4772]: E0127 17:01:46.895321 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b\": container with ID starting with c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b not found: ID does not exist" containerID="c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.895356 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b"} err="failed to get container status \"c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b\": rpc error: code = NotFound desc = could not find container \"c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b\": container with ID starting with c7710f16831fa2297e6c94f73e15418fba26c2b35e5006130588a2939a816e3b not found: ID does not exist" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.895374 4772 scope.go:117] "RemoveContainer" containerID="2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4" Jan 27 17:01:46 crc kubenswrapper[4772]: E0127 
17:01:46.895654 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4\": container with ID starting with 2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4 not found: ID does not exist" containerID="2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4" Jan 27 17:01:46 crc kubenswrapper[4772]: I0127 17:01:46.895682 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4"} err="failed to get container status \"2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4\": rpc error: code = NotFound desc = could not find container \"2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4\": container with ID starting with 2bada1bcab38f9662557d5435a8e993461513d6d69ba69f3f231aad6221846b4 not found: ID does not exist" Jan 27 17:01:48 crc kubenswrapper[4772]: I0127 17:01:48.676402 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" path="/var/lib/kubelet/pods/5b18dbed-da5b-4cc5-adff-4ea91a19d097/volumes" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.493275 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:20 crc kubenswrapper[4772]: E0127 17:03:20.498973 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="registry-server" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.499286 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="registry-server" Jan 27 17:03:20 crc kubenswrapper[4772]: E0127 17:03:20.499324 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="extract-utilities" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.499333 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="extract-utilities" Jan 27 17:03:20 crc kubenswrapper[4772]: E0127 17:03:20.499360 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="extract-content" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.499369 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="extract-content" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.499618 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b18dbed-da5b-4cc5-adff-4ea91a19d097" containerName="registry-server" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.501271 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.506451 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.581695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq22p\" (UniqueName: \"kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.582104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities\") pod \"certified-operators-b5nn7\" (UID: 
\"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.582453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.684843 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq22p\" (UniqueName: \"kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.684924 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.684960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.685446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content\") pod \"certified-operators-b5nn7\" (UID: 
\"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.685459 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.703566 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq22p\" (UniqueName: \"kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p\") pod \"certified-operators-b5nn7\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:20 crc kubenswrapper[4772]: I0127 17:03:20.875870 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:21 crc kubenswrapper[4772]: I0127 17:03:21.452034 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:21 crc kubenswrapper[4772]: I0127 17:03:21.851248 4772 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerID="fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402" exitCode=0 Jan 27 17:03:21 crc kubenswrapper[4772]: I0127 17:03:21.851320 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerDied","Data":"fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402"} Jan 27 17:03:21 crc kubenswrapper[4772]: I0127 17:03:21.851358 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" 
event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerStarted","Data":"868827379c2358104fc3ee3fd215040de500d1cf0eddf9c817b9a3388c61a59b"} Jan 27 17:03:21 crc kubenswrapper[4772]: I0127 17:03:21.855092 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:03:23 crc kubenswrapper[4772]: I0127 17:03:23.878198 4772 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerID="b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af" exitCode=0 Jan 27 17:03:23 crc kubenswrapper[4772]: I0127 17:03:23.878316 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerDied","Data":"b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af"} Jan 27 17:03:24 crc kubenswrapper[4772]: I0127 17:03:24.895680 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerStarted","Data":"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e"} Jan 27 17:03:24 crc kubenswrapper[4772]: I0127 17:03:24.919124 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b5nn7" podStartSLOduration=2.4662688790000002 podStartE2EDuration="4.919104964s" podCreationTimestamp="2026-01-27 17:03:20 +0000 UTC" firstStartedPulling="2026-01-27 17:03:21.854648298 +0000 UTC m=+6987.835257416" lastFinishedPulling="2026-01-27 17:03:24.307484383 +0000 UTC m=+6990.288093501" observedRunningTime="2026-01-27 17:03:24.917134818 +0000 UTC m=+6990.897743916" watchObservedRunningTime="2026-01-27 17:03:24.919104964 +0000 UTC m=+6990.899714062" Jan 27 17:03:30 crc kubenswrapper[4772]: I0127 17:03:30.876699 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:30 crc kubenswrapper[4772]: I0127 17:03:30.878357 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:30 crc kubenswrapper[4772]: I0127 17:03:30.956823 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:31 crc kubenswrapper[4772]: I0127 17:03:31.009948 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:31 crc kubenswrapper[4772]: I0127 17:03:31.206259 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:32 crc kubenswrapper[4772]: I0127 17:03:32.990423 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b5nn7" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="registry-server" containerID="cri-o://446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e" gracePeriod=2 Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.550098 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.568485 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content\") pod \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.569488 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities\") pod \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.570255 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq22p\" (UniqueName: \"kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p\") pod \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\" (UID: \"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a\") " Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.570710 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities" (OuterVolumeSpecName: "utilities") pod "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" (UID: "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.576605 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.586508 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p" (OuterVolumeSpecName: "kube-api-access-nq22p") pod "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" (UID: "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a"). InnerVolumeSpecName "kube-api-access-nq22p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.651385 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" (UID: "a6d8c5f6-5840-4996-8edf-b0cdd3ab833a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.678781 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq22p\" (UniqueName: \"kubernetes.io/projected/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-kube-api-access-nq22p\") on node \"crc\" DevicePath \"\"" Jan 27 17:03:33 crc kubenswrapper[4772]: I0127 17:03:33.678838 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.020370 4772 generic.go:334] "Generic (PLEG): container finished" podID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerID="446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e" exitCode=0 Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.020438 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerDied","Data":"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e"} Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.020489 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b5nn7" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.020511 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b5nn7" event={"ID":"a6d8c5f6-5840-4996-8edf-b0cdd3ab833a","Type":"ContainerDied","Data":"868827379c2358104fc3ee3fd215040de500d1cf0eddf9c817b9a3388c61a59b"} Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.020558 4772 scope.go:117] "RemoveContainer" containerID="446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.042958 4772 scope.go:117] "RemoveContainer" containerID="b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.067603 4772 scope.go:117] "RemoveContainer" containerID="fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.071418 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.079245 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b5nn7"] Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.105446 4772 scope.go:117] "RemoveContainer" containerID="446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e" Jan 27 17:03:34 crc kubenswrapper[4772]: E0127 17:03:34.105883 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e\": container with ID starting with 446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e not found: ID does not exist" containerID="446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.105948 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e"} err="failed to get container status \"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e\": rpc error: code = NotFound desc = could not find container \"446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e\": container with ID starting with 446397c85d820ac08eaaff34c1f773386f0e789b43f7fcda38339ade6491fa4e not found: ID does not exist" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.105997 4772 scope.go:117] "RemoveContainer" containerID="b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af" Jan 27 17:03:34 crc kubenswrapper[4772]: E0127 17:03:34.106732 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af\": container with ID starting with b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af not found: ID does not exist" containerID="b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.106770 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af"} err="failed to get container status \"b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af\": rpc error: code = NotFound desc = could not find container \"b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af\": container with ID starting with b4ebb1482a887c740701c8024c6fde60a92ddf64448d43fc59fdc8d0125b04af not found: ID does not exist" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.106792 4772 scope.go:117] "RemoveContainer" containerID="fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402" Jan 27 17:03:34 crc kubenswrapper[4772]: E0127 
17:03:34.107067 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402\": container with ID starting with fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402 not found: ID does not exist" containerID="fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.107120 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402"} err="failed to get container status \"fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402\": rpc error: code = NotFound desc = could not find container \"fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402\": container with ID starting with fc3b4387c7de25d81bfd6644e1b09a028e3623cf29b8e426464b8e6357b85402 not found: ID does not exist" Jan 27 17:03:34 crc kubenswrapper[4772]: I0127 17:03:34.694716 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" path="/var/lib/kubelet/pods/a6d8c5f6-5840-4996-8edf-b0cdd3ab833a/volumes" Jan 27 17:03:42 crc kubenswrapper[4772]: I0127 17:03:42.058690 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:03:42 crc kubenswrapper[4772]: I0127 17:03:42.059576 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 27 17:04:12 crc kubenswrapper[4772]: I0127 17:04:12.058104 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:04:12 crc kubenswrapper[4772]: I0127 17:04:12.058779 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.058285 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.058976 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.059027 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.059840 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7"} 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.059895 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" gracePeriod=600 Jan 27 17:04:42 crc kubenswrapper[4772]: E0127 17:04:42.186363 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.773742 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" exitCode=0 Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.773810 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7"} Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.773912 4772 scope.go:117] "RemoveContainer" containerID="0c934f11ffcfb51be7cc650e76f8b239868b5820a22f2783555e83c31ae7ef8b" Jan 27 17:04:42 crc kubenswrapper[4772]: I0127 17:04:42.774905 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 
27 17:04:42 crc kubenswrapper[4772]: E0127 17:04:42.775238 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:04:54 crc kubenswrapper[4772]: I0127 17:04:54.677927 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:04:54 crc kubenswrapper[4772]: E0127 17:04:54.679218 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:05:09 crc kubenswrapper[4772]: I0127 17:05:09.674883 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:05:09 crc kubenswrapper[4772]: E0127 17:05:09.677312 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:05:21 crc kubenswrapper[4772]: I0127 17:05:21.663231 4772 scope.go:117] "RemoveContainer" 
containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:05:21 crc kubenswrapper[4772]: E0127 17:05:21.664287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:05:32 crc kubenswrapper[4772]: I0127 17:05:32.667337 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:05:32 crc kubenswrapper[4772]: E0127 17:05:32.668152 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:05:46 crc kubenswrapper[4772]: I0127 17:05:46.663371 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:05:46 crc kubenswrapper[4772]: E0127 17:05:46.664062 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:06:01 crc kubenswrapper[4772]: I0127 17:06:01.663381 4772 scope.go:117] 
"RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:06:01 crc kubenswrapper[4772]: E0127 17:06:01.664557 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:06:12 crc kubenswrapper[4772]: I0127 17:06:12.663673 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:06:12 crc kubenswrapper[4772]: E0127 17:06:12.664583 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:06:26 crc kubenswrapper[4772]: I0127 17:06:26.663959 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:06:26 crc kubenswrapper[4772]: E0127 17:06:26.665228 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:06:38 crc kubenswrapper[4772]: I0127 17:06:38.663078 
4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:06:38 crc kubenswrapper[4772]: E0127 17:06:38.663864 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:06:51 crc kubenswrapper[4772]: I0127 17:06:51.662948 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:06:51 crc kubenswrapper[4772]: E0127 17:06:51.663985 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:07:03 crc kubenswrapper[4772]: I0127 17:07:03.662969 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:07:03 crc kubenswrapper[4772]: E0127 17:07:03.664190 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:07:16 crc kubenswrapper[4772]: I0127 
17:07:16.663523 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:07:16 crc kubenswrapper[4772]: E0127 17:07:16.665974 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:07:29 crc kubenswrapper[4772]: I0127 17:07:29.675086 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:07:29 crc kubenswrapper[4772]: E0127 17:07:29.675848 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:07:41 crc kubenswrapper[4772]: I0127 17:07:41.663312 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:07:41 crc kubenswrapper[4772]: E0127 17:07:41.664131 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:07:55 crc 
kubenswrapper[4772]: I0127 17:07:55.662705 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:07:55 crc kubenswrapper[4772]: E0127 17:07:55.663575 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:08:06 crc kubenswrapper[4772]: I0127 17:08:06.663775 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:08:06 crc kubenswrapper[4772]: E0127 17:08:06.664889 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:08:17 crc kubenswrapper[4772]: I0127 17:08:17.663780 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:08:17 crc kubenswrapper[4772]: E0127 17:08:17.664861 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 
27 17:08:32 crc kubenswrapper[4772]: I0127 17:08:32.664235 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:08:32 crc kubenswrapper[4772]: E0127 17:08:32.665110 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.959004 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:08:39 crc kubenswrapper[4772]: E0127 17:08:39.960099 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="extract-utilities" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.960116 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="extract-utilities" Jan 27 17:08:39 crc kubenswrapper[4772]: E0127 17:08:39.960195 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="extract-content" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.960204 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="extract-content" Jan 27 17:08:39 crc kubenswrapper[4772]: E0127 17:08:39.960218 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="registry-server" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.960227 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" 
containerName="registry-server" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.960472 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d8c5f6-5840-4996-8edf-b0cdd3ab833a" containerName="registry-server" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.962299 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.986493 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.996366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2ls\" (UniqueName: \"kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.996762 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:39 crc kubenswrapper[4772]: I0127 17:08:39.996993 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.099579 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.099740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2ls\" (UniqueName: \"kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.099856 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.100192 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.100398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.120994 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2ls\" (UniqueName: 
\"kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls\") pod \"redhat-operators-v67fk\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.288345 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:40 crc kubenswrapper[4772]: I0127 17:08:40.771508 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:08:41 crc kubenswrapper[4772]: I0127 17:08:41.327329 4772 generic.go:334] "Generic (PLEG): container finished" podID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerID="12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057" exitCode=0 Jan 27 17:08:41 crc kubenswrapper[4772]: I0127 17:08:41.327551 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerDied","Data":"12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057"} Jan 27 17:08:41 crc kubenswrapper[4772]: I0127 17:08:41.327694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerStarted","Data":"d4542b704bdfa74b42beaef5f64efafc7ed30121010c3bdd793a4aaa21d8cfa5"} Jan 27 17:08:41 crc kubenswrapper[4772]: I0127 17:08:41.329922 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:08:42 crc kubenswrapper[4772]: I0127 17:08:42.339529 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerStarted","Data":"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c"} Jan 27 17:08:43 crc 
kubenswrapper[4772]: I0127 17:08:43.360050 4772 generic.go:334] "Generic (PLEG): container finished" podID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerID="06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c" exitCode=0 Jan 27 17:08:43 crc kubenswrapper[4772]: I0127 17:08:43.360126 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerDied","Data":"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c"} Jan 27 17:08:43 crc kubenswrapper[4772]: I0127 17:08:43.664025 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:08:43 crc kubenswrapper[4772]: E0127 17:08:43.664928 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:08:45 crc kubenswrapper[4772]: I0127 17:08:45.406388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerStarted","Data":"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f"} Jan 27 17:08:45 crc kubenswrapper[4772]: I0127 17:08:45.434622 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v67fk" podStartSLOduration=3.1334585600000002 podStartE2EDuration="6.434605184s" podCreationTimestamp="2026-01-27 17:08:39 +0000 UTC" firstStartedPulling="2026-01-27 17:08:41.329638392 +0000 UTC m=+7307.310247480" lastFinishedPulling="2026-01-27 17:08:44.630784996 +0000 
UTC m=+7310.611394104" observedRunningTime="2026-01-27 17:08:45.433025579 +0000 UTC m=+7311.413634687" watchObservedRunningTime="2026-01-27 17:08:45.434605184 +0000 UTC m=+7311.415214282" Jan 27 17:08:50 crc kubenswrapper[4772]: I0127 17:08:50.289557 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:50 crc kubenswrapper[4772]: I0127 17:08:50.291004 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:08:51 crc kubenswrapper[4772]: I0127 17:08:51.347092 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v67fk" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="registry-server" probeResult="failure" output=< Jan 27 17:08:51 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:08:51 crc kubenswrapper[4772]: > Jan 27 17:08:54 crc kubenswrapper[4772]: I0127 17:08:54.676563 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:08:54 crc kubenswrapper[4772]: E0127 17:08:54.677478 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:09:00 crc kubenswrapper[4772]: I0127 17:09:00.358803 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:09:00 crc kubenswrapper[4772]: I0127 17:09:00.422390 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:09:00 crc kubenswrapper[4772]: I0127 17:09:00.606217 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:09:01 crc kubenswrapper[4772]: I0127 17:09:01.568536 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v67fk" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="registry-server" containerID="cri-o://670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f" gracePeriod=2 Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.129570 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.274651 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww2ls\" (UniqueName: \"kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls\") pod \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.274782 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content\") pod \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.275023 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities\") pod \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\" (UID: \"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a\") " Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.275743 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities" (OuterVolumeSpecName: "utilities") pod "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" (UID: "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.284558 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls" (OuterVolumeSpecName: "kube-api-access-ww2ls") pod "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" (UID: "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a"). InnerVolumeSpecName "kube-api-access-ww2ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.377508 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.377747 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww2ls\" (UniqueName: \"kubernetes.io/projected/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-kube-api-access-ww2ls\") on node \"crc\" DevicePath \"\"" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.402548 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" (UID: "4696c11b-cd7f-4a9b-84d0-b4e59ff4383a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.479481 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.586540 4772 generic.go:334] "Generic (PLEG): container finished" podID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerID="670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f" exitCode=0 Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.586600 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerDied","Data":"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f"} Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.586641 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v67fk" event={"ID":"4696c11b-cd7f-4a9b-84d0-b4e59ff4383a","Type":"ContainerDied","Data":"d4542b704bdfa74b42beaef5f64efafc7ed30121010c3bdd793a4aaa21d8cfa5"} Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.586645 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v67fk" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.586699 4772 scope.go:117] "RemoveContainer" containerID="670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.624854 4772 scope.go:117] "RemoveContainer" containerID="06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.659992 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.678329 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v67fk"] Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.682157 4772 scope.go:117] "RemoveContainer" containerID="12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.724517 4772 scope.go:117] "RemoveContainer" containerID="670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f" Jan 27 17:09:02 crc kubenswrapper[4772]: E0127 17:09:02.725328 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f\": container with ID starting with 670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f not found: ID does not exist" containerID="670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.725384 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f"} err="failed to get container status \"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f\": rpc error: code = NotFound desc = could not find container 
\"670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f\": container with ID starting with 670c3dd5d1eadcc4c7af0be85bbbd52745f9845379d48bccf45b532e278f530f not found: ID does not exist" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.725421 4772 scope.go:117] "RemoveContainer" containerID="06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c" Jan 27 17:09:02 crc kubenswrapper[4772]: E0127 17:09:02.725870 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c\": container with ID starting with 06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c not found: ID does not exist" containerID="06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.726083 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c"} err="failed to get container status \"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c\": rpc error: code = NotFound desc = could not find container \"06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c\": container with ID starting with 06a1f585dd0be4dfae7eee556971dc6e4e153c2aa01f033489b777fe2060b96c not found: ID does not exist" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.726325 4772 scope.go:117] "RemoveContainer" containerID="12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057" Jan 27 17:09:02 crc kubenswrapper[4772]: E0127 17:09:02.727123 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057\": container with ID starting with 12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057 not found: ID does not exist" 
containerID="12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057" Jan 27 17:09:02 crc kubenswrapper[4772]: I0127 17:09:02.727220 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057"} err="failed to get container status \"12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057\": rpc error: code = NotFound desc = could not find container \"12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057\": container with ID starting with 12499ba818facb5728a5e2af322ab50a280b5ba47cb8956bb6a0e6872c81b057 not found: ID does not exist" Jan 27 17:09:04 crc kubenswrapper[4772]: I0127 17:09:04.685677 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" path="/var/lib/kubelet/pods/4696c11b-cd7f-4a9b-84d0-b4e59ff4383a/volumes" Jan 27 17:09:09 crc kubenswrapper[4772]: I0127 17:09:09.662756 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:09:09 crc kubenswrapper[4772]: E0127 17:09:09.664555 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:09:23 crc kubenswrapper[4772]: I0127 17:09:23.662797 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:09:23 crc kubenswrapper[4772]: E0127 17:09:23.663557 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:09:38 crc kubenswrapper[4772]: I0127 17:09:38.664513 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:09:38 crc kubenswrapper[4772]: E0127 17:09:38.665525 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:09:52 crc kubenswrapper[4772]: I0127 17:09:52.663343 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:09:53 crc kubenswrapper[4772]: I0127 17:09:53.129744 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3"} Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.973355 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kb62q"] Jan 27 17:11:11 crc kubenswrapper[4772]: E0127 17:11:11.974531 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="extract-utilities" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.974550 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" 
containerName="extract-utilities" Jan 27 17:11:11 crc kubenswrapper[4772]: E0127 17:11:11.974568 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="extract-content" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.974578 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="extract-content" Jan 27 17:11:11 crc kubenswrapper[4772]: E0127 17:11:11.974595 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="registry-server" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.974604 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="registry-server" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.974835 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4696c11b-cd7f-4a9b-84d0-b4e59ff4383a" containerName="registry-server" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.976773 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:11 crc kubenswrapper[4772]: I0127 17:11:11.997269 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kb62q"] Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.075619 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjx6\" (UniqueName: \"kubernetes.io/projected/e840c358-8d41-4381-b643-3bd35f0716a2-kube-api-access-wgjx6\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.075692 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-catalog-content\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.076065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-utilities\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.178346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-utilities\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.178446 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wgjx6\" (UniqueName: \"kubernetes.io/projected/e840c358-8d41-4381-b643-3bd35f0716a2-kube-api-access-wgjx6\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.178476 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-catalog-content\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.178928 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-utilities\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.178967 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e840c358-8d41-4381-b643-3bd35f0716a2-catalog-content\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.197970 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjx6\" (UniqueName: \"kubernetes.io/projected/e840c358-8d41-4381-b643-3bd35f0716a2-kube-api-access-wgjx6\") pod \"community-operators-kb62q\" (UID: \"e840c358-8d41-4381-b643-3bd35f0716a2\") " pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.319522 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.850741 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kb62q"] Jan 27 17:11:12 crc kubenswrapper[4772]: I0127 17:11:12.958138 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb62q" event={"ID":"e840c358-8d41-4381-b643-3bd35f0716a2","Type":"ContainerStarted","Data":"2d27b8d9a06391c00a3262da5e1b5a9e4e974a9ab2faf27c3443e60c44269845"} Jan 27 17:11:13 crc kubenswrapper[4772]: I0127 17:11:13.972304 4772 generic.go:334] "Generic (PLEG): container finished" podID="e840c358-8d41-4381-b643-3bd35f0716a2" containerID="8c0024f9c25988180a3cb150988e4b597cfed85d7eb62aa9274ef355d1d88296" exitCode=0 Jan 27 17:11:13 crc kubenswrapper[4772]: I0127 17:11:13.972345 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb62q" event={"ID":"e840c358-8d41-4381-b643-3bd35f0716a2","Type":"ContainerDied","Data":"8c0024f9c25988180a3cb150988e4b597cfed85d7eb62aa9274ef355d1d88296"} Jan 27 17:11:18 crc kubenswrapper[4772]: I0127 17:11:18.016687 4772 generic.go:334] "Generic (PLEG): container finished" podID="e840c358-8d41-4381-b643-3bd35f0716a2" containerID="7949054364bb39054181aad8e729a07d19b95a294c767430f2b49200025c69c8" exitCode=0 Jan 27 17:11:18 crc kubenswrapper[4772]: I0127 17:11:18.017009 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb62q" event={"ID":"e840c358-8d41-4381-b643-3bd35f0716a2","Type":"ContainerDied","Data":"7949054364bb39054181aad8e729a07d19b95a294c767430f2b49200025c69c8"} Jan 27 17:11:19 crc kubenswrapper[4772]: I0127 17:11:19.031966 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kb62q" 
event={"ID":"e840c358-8d41-4381-b643-3bd35f0716a2","Type":"ContainerStarted","Data":"3cf860b49847ad6631c20f8f8a781d32c0a504d63e63a22b870a53c993267fd3"} Jan 27 17:11:19 crc kubenswrapper[4772]: I0127 17:11:19.073282 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kb62q" podStartSLOduration=3.59528929 podStartE2EDuration="8.07325119s" podCreationTimestamp="2026-01-27 17:11:11 +0000 UTC" firstStartedPulling="2026-01-27 17:11:13.974781272 +0000 UTC m=+7459.955390380" lastFinishedPulling="2026-01-27 17:11:18.452743182 +0000 UTC m=+7464.433352280" observedRunningTime="2026-01-27 17:11:19.055350002 +0000 UTC m=+7465.035959140" watchObservedRunningTime="2026-01-27 17:11:19.07325119 +0000 UTC m=+7465.053860328" Jan 27 17:11:22 crc kubenswrapper[4772]: I0127 17:11:22.320914 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:22 crc kubenswrapper[4772]: I0127 17:11:22.321516 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:22 crc kubenswrapper[4772]: I0127 17:11:22.418848 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:23 crc kubenswrapper[4772]: I0127 17:11:23.156094 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kb62q" Jan 27 17:11:23 crc kubenswrapper[4772]: I0127 17:11:23.283314 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kb62q"] Jan 27 17:11:23 crc kubenswrapper[4772]: I0127 17:11:23.352346 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:11:23 crc kubenswrapper[4772]: I0127 17:11:23.352640 4772 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-txfct" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="registry-server" containerID="cri-o://114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93" gracePeriod=2 Jan 27 17:11:23 crc kubenswrapper[4772]: I0127 17:11:23.912031 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.031452 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content\") pod \"10eef819-4355-4b65-bb81-95c055327034\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.031638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities\") pod \"10eef819-4355-4b65-bb81-95c055327034\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.031698 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhfw\" (UniqueName: \"kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw\") pod \"10eef819-4355-4b65-bb81-95c055327034\" (UID: \"10eef819-4355-4b65-bb81-95c055327034\") " Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.032224 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities" (OuterVolumeSpecName: "utilities") pod "10eef819-4355-4b65-bb81-95c055327034" (UID: "10eef819-4355-4b65-bb81-95c055327034"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.041514 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw" (OuterVolumeSpecName: "kube-api-access-tkhfw") pod "10eef819-4355-4b65-bb81-95c055327034" (UID: "10eef819-4355-4b65-bb81-95c055327034"). InnerVolumeSpecName "kube-api-access-tkhfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.084033 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10eef819-4355-4b65-bb81-95c055327034" (UID: "10eef819-4355-4b65-bb81-95c055327034"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.091305 4772 generic.go:334] "Generic (PLEG): container finished" podID="10eef819-4355-4b65-bb81-95c055327034" containerID="114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93" exitCode=0 Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.092146 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txfct" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.092372 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerDied","Data":"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93"} Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.092445 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txfct" event={"ID":"10eef819-4355-4b65-bb81-95c055327034","Type":"ContainerDied","Data":"23557e0504a3fe336145be5a361937c216f6e8cad04bfe40c3131be90c233123"} Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.092477 4772 scope.go:117] "RemoveContainer" containerID="114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.120431 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.127240 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-txfct"] Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.130037 4772 scope.go:117] "RemoveContainer" containerID="490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.133586 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.133613 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhfw\" (UniqueName: \"kubernetes.io/projected/10eef819-4355-4b65-bb81-95c055327034-kube-api-access-tkhfw\") on node \"crc\" DevicePath \"\"" Jan 27 17:11:24 crc 
kubenswrapper[4772]: I0127 17:11:24.133625 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10eef819-4355-4b65-bb81-95c055327034-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.150682 4772 scope.go:117] "RemoveContainer" containerID="7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.191275 4772 scope.go:117] "RemoveContainer" containerID="114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93" Jan 27 17:11:24 crc kubenswrapper[4772]: E0127 17:11:24.191802 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93\": container with ID starting with 114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93 not found: ID does not exist" containerID="114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.191834 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93"} err="failed to get container status \"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93\": rpc error: code = NotFound desc = could not find container \"114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93\": container with ID starting with 114bf805859534c9af2af0c87c4604c2095e959428fa0c7464d58e26e36fef93 not found: ID does not exist" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.191855 4772 scope.go:117] "RemoveContainer" containerID="490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605" Jan 27 17:11:24 crc kubenswrapper[4772]: E0127 17:11:24.192094 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605\": container with ID starting with 490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605 not found: ID does not exist" containerID="490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.192125 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605"} err="failed to get container status \"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605\": rpc error: code = NotFound desc = could not find container \"490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605\": container with ID starting with 490d9faa657579105d3e49ba461cb9e24406692936441c0207c58ab54ad55605 not found: ID does not exist" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.192139 4772 scope.go:117] "RemoveContainer" containerID="7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c" Jan 27 17:11:24 crc kubenswrapper[4772]: E0127 17:11:24.192360 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c\": container with ID starting with 7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c not found: ID does not exist" containerID="7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.192387 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c"} err="failed to get container status \"7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c\": rpc error: code = NotFound desc = could not find container 
\"7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c\": container with ID starting with 7236c5d9dd7b59e4e1b4af9645ea7ed61b907d3619dcb83d87301be5c3316f7c not found: ID does not exist" Jan 27 17:11:24 crc kubenswrapper[4772]: I0127 17:11:24.677076 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10eef819-4355-4b65-bb81-95c055327034" path="/var/lib/kubelet/pods/10eef819-4355-4b65-bb81-95c055327034/volumes" Jan 27 17:12:12 crc kubenswrapper[4772]: I0127 17:12:12.059390 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:12:12 crc kubenswrapper[4772]: I0127 17:12:12.062140 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.281323 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:25 crc kubenswrapper[4772]: E0127 17:12:25.286713 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="extract-utilities" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.286864 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="extract-utilities" Jan 27 17:12:25 crc kubenswrapper[4772]: E0127 17:12:25.286964 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="extract-content" Jan 27 17:12:25 crc 
kubenswrapper[4772]: I0127 17:12:25.287050 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="extract-content" Jan 27 17:12:25 crc kubenswrapper[4772]: E0127 17:12:25.287145 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="registry-server" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.287247 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="registry-server" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.287593 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="10eef819-4355-4b65-bb81-95c055327034" containerName="registry-server" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.289462 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.315953 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.355864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.356068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 
17:12:25.356158 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbjdb\" (UniqueName: \"kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.458011 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.458148 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.458226 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbjdb\" (UniqueName: \"kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.458789 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 
17:12:25.459065 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.479589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbjdb\" (UniqueName: \"kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb\") pod \"redhat-marketplace-kch9d\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:25 crc kubenswrapper[4772]: I0127 17:12:25.627512 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:26 crc kubenswrapper[4772]: I0127 17:12:26.174775 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:26 crc kubenswrapper[4772]: W0127 17:12:26.183498 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb293122_6b7f_4867_a949_55eb128d7ed4.slice/crio-f69b93ed1f2e5c33fe404e803dd9066123c9ea68b33c570a0dd62ead30b49ded WatchSource:0}: Error finding container f69b93ed1f2e5c33fe404e803dd9066123c9ea68b33c570a0dd62ead30b49ded: Status 404 returned error can't find the container with id f69b93ed1f2e5c33fe404e803dd9066123c9ea68b33c570a0dd62ead30b49ded Jan 27 17:12:26 crc kubenswrapper[4772]: I0127 17:12:26.790892 4772 generic.go:334] "Generic (PLEG): container finished" podID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerID="1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde" exitCode=0 Jan 27 17:12:26 crc kubenswrapper[4772]: I0127 17:12:26.790931 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerDied","Data":"1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde"} Jan 27 17:12:26 crc kubenswrapper[4772]: I0127 17:12:26.790957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerStarted","Data":"f69b93ed1f2e5c33fe404e803dd9066123c9ea68b33c570a0dd62ead30b49ded"} Jan 27 17:12:28 crc kubenswrapper[4772]: I0127 17:12:28.812677 4772 generic.go:334] "Generic (PLEG): container finished" podID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerID="6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2" exitCode=0 Jan 27 17:12:28 crc kubenswrapper[4772]: I0127 17:12:28.812806 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerDied","Data":"6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2"} Jan 27 17:12:33 crc kubenswrapper[4772]: I0127 17:12:33.856145 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerStarted","Data":"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d"} Jan 27 17:12:33 crc kubenswrapper[4772]: I0127 17:12:33.877052 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kch9d" podStartSLOduration=2.395932153 podStartE2EDuration="8.877037175s" podCreationTimestamp="2026-01-27 17:12:25 +0000 UTC" firstStartedPulling="2026-01-27 17:12:26.792234651 +0000 UTC m=+7532.772843749" lastFinishedPulling="2026-01-27 17:12:33.273339673 +0000 UTC m=+7539.253948771" observedRunningTime="2026-01-27 17:12:33.875693857 +0000 UTC m=+7539.856302955" 
watchObservedRunningTime="2026-01-27 17:12:33.877037175 +0000 UTC m=+7539.857646273" Jan 27 17:12:35 crc kubenswrapper[4772]: I0127 17:12:35.628565 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:35 crc kubenswrapper[4772]: I0127 17:12:35.629083 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:35 crc kubenswrapper[4772]: I0127 17:12:35.711456 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:42 crc kubenswrapper[4772]: I0127 17:12:42.059056 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:12:42 crc kubenswrapper[4772]: I0127 17:12:42.060442 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:12:45 crc kubenswrapper[4772]: I0127 17:12:45.687833 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:45 crc kubenswrapper[4772]: I0127 17:12:45.747828 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:45 crc kubenswrapper[4772]: I0127 17:12:45.987644 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kch9d" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" 
containerName="registry-server" containerID="cri-o://606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d" gracePeriod=2 Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.501057 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.628434 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content\") pod \"bb293122-6b7f-4867-a949-55eb128d7ed4\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.628633 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities\") pod \"bb293122-6b7f-4867-a949-55eb128d7ed4\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.628665 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbjdb\" (UniqueName: \"kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb\") pod \"bb293122-6b7f-4867-a949-55eb128d7ed4\" (UID: \"bb293122-6b7f-4867-a949-55eb128d7ed4\") " Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.629858 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities" (OuterVolumeSpecName: "utilities") pod "bb293122-6b7f-4867-a949-55eb128d7ed4" (UID: "bb293122-6b7f-4867-a949-55eb128d7ed4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.641463 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb" (OuterVolumeSpecName: "kube-api-access-qbjdb") pod "bb293122-6b7f-4867-a949-55eb128d7ed4" (UID: "bb293122-6b7f-4867-a949-55eb128d7ed4"). InnerVolumeSpecName "kube-api-access-qbjdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.662228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb293122-6b7f-4867-a949-55eb128d7ed4" (UID: "bb293122-6b7f-4867-a949-55eb128d7ed4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.732324 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.733145 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbjdb\" (UniqueName: \"kubernetes.io/projected/bb293122-6b7f-4867-a949-55eb128d7ed4-kube-api-access-qbjdb\") on node \"crc\" DevicePath \"\"" Jan 27 17:12:46 crc kubenswrapper[4772]: I0127 17:12:46.733220 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb293122-6b7f-4867-a949-55eb128d7ed4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.005574 4772 generic.go:334] "Generic (PLEG): container finished" podID="bb293122-6b7f-4867-a949-55eb128d7ed4" 
containerID="606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d" exitCode=0 Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.005626 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerDied","Data":"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d"} Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.005658 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kch9d" event={"ID":"bb293122-6b7f-4867-a949-55eb128d7ed4","Type":"ContainerDied","Data":"f69b93ed1f2e5c33fe404e803dd9066123c9ea68b33c570a0dd62ead30b49ded"} Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.005682 4772 scope.go:117] "RemoveContainer" containerID="606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.005725 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kch9d" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.049163 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.054475 4772 scope.go:117] "RemoveContainer" containerID="6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.059414 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kch9d"] Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.080201 4772 scope.go:117] "RemoveContainer" containerID="1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.138686 4772 scope.go:117] "RemoveContainer" containerID="606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d" Jan 27 17:12:47 crc kubenswrapper[4772]: E0127 17:12:47.139238 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d\": container with ID starting with 606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d not found: ID does not exist" containerID="606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.139281 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d"} err="failed to get container status \"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d\": rpc error: code = NotFound desc = could not find container \"606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d\": container with ID starting with 606a545c1b864cd576bd005c79dedbfba1ea800f4253958444bbd989a1dca52d not found: 
ID does not exist" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.139310 4772 scope.go:117] "RemoveContainer" containerID="6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2" Jan 27 17:12:47 crc kubenswrapper[4772]: E0127 17:12:47.139678 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2\": container with ID starting with 6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2 not found: ID does not exist" containerID="6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.139718 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2"} err="failed to get container status \"6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2\": rpc error: code = NotFound desc = could not find container \"6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2\": container with ID starting with 6d808146785900f1a51eaac3c8b810a51d7e6d7fdd2da00ad5472854f5f87dc2 not found: ID does not exist" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.139740 4772 scope.go:117] "RemoveContainer" containerID="1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde" Jan 27 17:12:47 crc kubenswrapper[4772]: E0127 17:12:47.140085 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde\": container with ID starting with 1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde not found: ID does not exist" containerID="1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde" Jan 27 17:12:47 crc kubenswrapper[4772]: I0127 17:12:47.140111 4772 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde"} err="failed to get container status \"1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde\": rpc error: code = NotFound desc = could not find container \"1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde\": container with ID starting with 1465504341b8ca1cf6d84aac554bd7d0847a03f9111d50be755b674dab248fde not found: ID does not exist" Jan 27 17:12:48 crc kubenswrapper[4772]: I0127 17:12:48.683945 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" path="/var/lib/kubelet/pods/bb293122-6b7f-4867-a949-55eb128d7ed4/volumes" Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.059109 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.060112 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.060229 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.061492 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3"} 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.061598 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3" gracePeriod=600 Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.742607 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3" exitCode=0 Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.742699 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3"} Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.743008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246"} Jan 27 17:13:12 crc kubenswrapper[4772]: I0127 17:13:12.743050 4772 scope.go:117] "RemoveContainer" containerID="9bc4d0691b7d281178157feda7a06246e962bb43aa27764495f3ea77eef906b7" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.877880 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:17 crc kubenswrapper[4772]: E0127 17:14:17.879238 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="registry-server" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.879261 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="registry-server" Jan 27 17:14:17 crc kubenswrapper[4772]: E0127 17:14:17.879286 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="extract-content" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.879297 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="extract-content" Jan 27 17:14:17 crc kubenswrapper[4772]: E0127 17:14:17.879322 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="extract-utilities" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.879334 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="extract-utilities" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.879635 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb293122-6b7f-4867-a949-55eb128d7ed4" containerName="registry-server" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.882076 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:17 crc kubenswrapper[4772]: I0127 17:14:17.888144 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.016731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.017070 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.017092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ndg\" (UniqueName: \"kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.119160 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.119342 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.119393 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ndg\" (UniqueName: \"kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.119697 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.120086 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.142714 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ndg\" (UniqueName: \"kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg\") pod \"certified-operators-dlnbw\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.213391 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:18 crc kubenswrapper[4772]: I0127 17:14:18.548535 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:19 crc kubenswrapper[4772]: I0127 17:14:19.502083 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9192220-7559-4548-a923-ff5096e93763" containerID="dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa" exitCode=0 Jan 27 17:14:19 crc kubenswrapper[4772]: I0127 17:14:19.502205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerDied","Data":"dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa"} Jan 27 17:14:19 crc kubenswrapper[4772]: I0127 17:14:19.502485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerStarted","Data":"ce84b67e6b038a4d23f97f5b2ddc4a454987ce4f78ab313a4e04e788041f4d13"} Jan 27 17:14:19 crc kubenswrapper[4772]: I0127 17:14:19.506717 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:14:21 crc kubenswrapper[4772]: I0127 17:14:21.526348 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9192220-7559-4548-a923-ff5096e93763" containerID="3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9" exitCode=0 Jan 27 17:14:21 crc kubenswrapper[4772]: I0127 17:14:21.527249 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerDied","Data":"3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9"} Jan 27 17:14:22 crc kubenswrapper[4772]: I0127 17:14:22.538007 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerStarted","Data":"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a"} Jan 27 17:14:22 crc kubenswrapper[4772]: I0127 17:14:22.563158 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlnbw" podStartSLOduration=3.069485953 podStartE2EDuration="5.563143755s" podCreationTimestamp="2026-01-27 17:14:17 +0000 UTC" firstStartedPulling="2026-01-27 17:14:19.506504015 +0000 UTC m=+7645.487113113" lastFinishedPulling="2026-01-27 17:14:22.000161807 +0000 UTC m=+7647.980770915" observedRunningTime="2026-01-27 17:14:22.554067817 +0000 UTC m=+7648.534676915" watchObservedRunningTime="2026-01-27 17:14:22.563143755 +0000 UTC m=+7648.543752853" Jan 27 17:14:28 crc kubenswrapper[4772]: I0127 17:14:28.214047 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:28 crc kubenswrapper[4772]: I0127 17:14:28.215735 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:28 crc kubenswrapper[4772]: I0127 17:14:28.296552 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:28 crc kubenswrapper[4772]: I0127 17:14:28.643082 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:28 crc kubenswrapper[4772]: I0127 17:14:28.702466 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:30 crc kubenswrapper[4772]: I0127 17:14:30.616433 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlnbw" 
podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="registry-server" containerID="cri-o://3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a" gracePeriod=2 Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.134796 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.211642 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities\") pod \"a9192220-7559-4548-a923-ff5096e93763\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.211733 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content\") pod \"a9192220-7559-4548-a923-ff5096e93763\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.211774 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ndg\" (UniqueName: \"kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg\") pod \"a9192220-7559-4548-a923-ff5096e93763\" (UID: \"a9192220-7559-4548-a923-ff5096e93763\") " Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.213406 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities" (OuterVolumeSpecName: "utilities") pod "a9192220-7559-4548-a923-ff5096e93763" (UID: "a9192220-7559-4548-a923-ff5096e93763"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.218241 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg" (OuterVolumeSpecName: "kube-api-access-c6ndg") pod "a9192220-7559-4548-a923-ff5096e93763" (UID: "a9192220-7559-4548-a923-ff5096e93763"). InnerVolumeSpecName "kube-api-access-c6ndg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.314063 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.314112 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ndg\" (UniqueName: \"kubernetes.io/projected/a9192220-7559-4548-a923-ff5096e93763-kube-api-access-c6ndg\") on node \"crc\" DevicePath \"\"" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.629967 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9192220-7559-4548-a923-ff5096e93763" containerID="3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a" exitCode=0 Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.630040 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerDied","Data":"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a"} Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.630085 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlnbw" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.630115 4772 scope.go:117] "RemoveContainer" containerID="3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.630095 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlnbw" event={"ID":"a9192220-7559-4548-a923-ff5096e93763","Type":"ContainerDied","Data":"ce84b67e6b038a4d23f97f5b2ddc4a454987ce4f78ab313a4e04e788041f4d13"} Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.663531 4772 scope.go:117] "RemoveContainer" containerID="3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.698013 4772 scope.go:117] "RemoveContainer" containerID="dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.739691 4772 scope.go:117] "RemoveContainer" containerID="3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a" Jan 27 17:14:31 crc kubenswrapper[4772]: E0127 17:14:31.740254 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a\": container with ID starting with 3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a not found: ID does not exist" containerID="3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.740322 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a"} err="failed to get container status \"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a\": rpc error: code = NotFound desc = could not find container 
\"3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a\": container with ID starting with 3fa8645b6717e26a0c6cd856f0a02932bce52294060a03d033d72ec5b711534a not found: ID does not exist" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.740366 4772 scope.go:117] "RemoveContainer" containerID="3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9" Jan 27 17:14:31 crc kubenswrapper[4772]: E0127 17:14:31.741066 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9\": container with ID starting with 3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9 not found: ID does not exist" containerID="3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.741103 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9"} err="failed to get container status \"3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9\": rpc error: code = NotFound desc = could not find container \"3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9\": container with ID starting with 3d5249bc1ee6cc2a2ba5b5dd83f216326e35b40984cb41a3156586b5873760f9 not found: ID does not exist" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.741131 4772 scope.go:117] "RemoveContainer" containerID="dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa" Jan 27 17:14:31 crc kubenswrapper[4772]: E0127 17:14:31.741529 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa\": container with ID starting with dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa not found: ID does not exist" 
containerID="dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.741566 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa"} err="failed to get container status \"dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa\": rpc error: code = NotFound desc = could not find container \"dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa\": container with ID starting with dbaaaa681dd7df9d44b3fd95409448538cdae1fcb4228db607f07ec31e16f8aa not found: ID does not exist" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.895824 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9192220-7559-4548-a923-ff5096e93763" (UID: "a9192220-7559-4548-a923-ff5096e93763"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.927102 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9192220-7559-4548-a923-ff5096e93763-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.976600 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:31 crc kubenswrapper[4772]: I0127 17:14:31.983770 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlnbw"] Jan 27 17:14:32 crc kubenswrapper[4772]: I0127 17:14:32.676656 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9192220-7559-4548-a923-ff5096e93763" path="/var/lib/kubelet/pods/a9192220-7559-4548-a923-ff5096e93763/volumes" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.199208 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm"] Jan 27 17:15:00 crc kubenswrapper[4772]: E0127 17:15:00.200188 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="extract-content" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.200203 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="extract-content" Jan 27 17:15:00 crc kubenswrapper[4772]: E0127 17:15:00.200238 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="registry-server" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.200244 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="registry-server" Jan 27 17:15:00 crc kubenswrapper[4772]: E0127 17:15:00.200260 4772 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="extract-utilities" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.200267 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="extract-utilities" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.200449 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9192220-7559-4548-a923-ff5096e93763" containerName="registry-server" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.201112 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.204669 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.204950 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.229123 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm"] Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.305221 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.305353 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmx7\" (UniqueName: 
\"kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.305396 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.407151 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.407328 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmx7\" (UniqueName: \"kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.407373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc 
kubenswrapper[4772]: I0127 17:15:00.408496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.416270 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.425594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmx7\" (UniqueName: \"kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7\") pod \"collect-profiles-29492235-blstm\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:00 crc kubenswrapper[4772]: I0127 17:15:00.534432 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:01 crc kubenswrapper[4772]: I0127 17:15:01.066642 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm"] Jan 27 17:15:01 crc kubenswrapper[4772]: I0127 17:15:01.256783 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" event={"ID":"0db9aea1-8a66-4447-9150-812172da2a26","Type":"ContainerStarted","Data":"8cac6045fccfbe1e377f14b47042375c5f42ee5f5f23e9bb7094c2ffd4201d8c"} Jan 27 17:15:01 crc kubenswrapper[4772]: I0127 17:15:01.259573 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" event={"ID":"0db9aea1-8a66-4447-9150-812172da2a26","Type":"ContainerStarted","Data":"336e829070d890d5276fa9fc7334182d780e1d99b11bda8599d5a4ab4962b513"} Jan 27 17:15:01 crc kubenswrapper[4772]: I0127 17:15:01.275375 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" podStartSLOduration=1.275351768 podStartE2EDuration="1.275351768s" podCreationTimestamp="2026-01-27 17:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 17:15:01.270600664 +0000 UTC m=+7687.251209772" watchObservedRunningTime="2026-01-27 17:15:01.275351768 +0000 UTC m=+7687.255960866" Jan 27 17:15:02 crc kubenswrapper[4772]: I0127 17:15:02.268063 4772 generic.go:334] "Generic (PLEG): container finished" podID="0db9aea1-8a66-4447-9150-812172da2a26" containerID="8cac6045fccfbe1e377f14b47042375c5f42ee5f5f23e9bb7094c2ffd4201d8c" exitCode=0 Jan 27 17:15:02 crc kubenswrapper[4772]: I0127 17:15:02.268334 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" event={"ID":"0db9aea1-8a66-4447-9150-812172da2a26","Type":"ContainerDied","Data":"8cac6045fccfbe1e377f14b47042375c5f42ee5f5f23e9bb7094c2ffd4201d8c"} Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.689967 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.802289 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume\") pod \"0db9aea1-8a66-4447-9150-812172da2a26\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.802369 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume\") pod \"0db9aea1-8a66-4447-9150-812172da2a26\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.802475 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmx7\" (UniqueName: \"kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7\") pod \"0db9aea1-8a66-4447-9150-812172da2a26\" (UID: \"0db9aea1-8a66-4447-9150-812172da2a26\") " Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.804572 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume" (OuterVolumeSpecName: "config-volume") pod "0db9aea1-8a66-4447-9150-812172da2a26" (UID: "0db9aea1-8a66-4447-9150-812172da2a26"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.809106 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7" (OuterVolumeSpecName: "kube-api-access-6qmx7") pod "0db9aea1-8a66-4447-9150-812172da2a26" (UID: "0db9aea1-8a66-4447-9150-812172da2a26"). InnerVolumeSpecName "kube-api-access-6qmx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.809959 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0db9aea1-8a66-4447-9150-812172da2a26" (UID: "0db9aea1-8a66-4447-9150-812172da2a26"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.905413 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0db9aea1-8a66-4447-9150-812172da2a26-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.905696 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0db9aea1-8a66-4447-9150-812172da2a26-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:15:03 crc kubenswrapper[4772]: I0127 17:15:03.905707 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmx7\" (UniqueName: \"kubernetes.io/projected/0db9aea1-8a66-4447-9150-812172da2a26-kube-api-access-6qmx7\") on node \"crc\" DevicePath \"\"" Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.296652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" 
event={"ID":"0db9aea1-8a66-4447-9150-812172da2a26","Type":"ContainerDied","Data":"336e829070d890d5276fa9fc7334182d780e1d99b11bda8599d5a4ab4962b513"} Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.297118 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="336e829070d890d5276fa9fc7334182d780e1d99b11bda8599d5a4ab4962b513" Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.297275 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492235-blstm" Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.371971 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9"] Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.382282 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492190-nkwl9"] Jan 27 17:15:04 crc kubenswrapper[4772]: I0127 17:15:04.686301 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad11684-a5b7-4df1-9d18-5179c6113f66" path="/var/lib/kubelet/pods/aad11684-a5b7-4df1-9d18-5179c6113f66/volumes" Jan 27 17:15:10 crc kubenswrapper[4772]: I0127 17:15:10.713958 4772 scope.go:117] "RemoveContainer" containerID="114edebee04cbeb82762a8f7e28bf44b5665934fa2746ec43b3a0a20d9084515" Jan 27 17:15:12 crc kubenswrapper[4772]: I0127 17:15:12.068304 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:15:12 crc kubenswrapper[4772]: I0127 17:15:12.069137 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:15:42 crc kubenswrapper[4772]: I0127 17:15:42.058350 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:15:42 crc kubenswrapper[4772]: I0127 17:15:42.058933 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:16:12 crc kubenswrapper[4772]: I0127 17:16:12.058925 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:16:12 crc kubenswrapper[4772]: I0127 17:16:12.061254 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:16:12 crc kubenswrapper[4772]: I0127 17:16:12.061493 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:16:12 crc kubenswrapper[4772]: I0127 17:16:12.062763 4772 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:16:12 crc kubenswrapper[4772]: I0127 17:16:12.063030 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" gracePeriod=600 Jan 27 17:16:12 crc kubenswrapper[4772]: E0127 17:16:12.211710 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:16:13 crc kubenswrapper[4772]: I0127 17:16:13.063960 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" exitCode=0 Jan 27 17:16:13 crc kubenswrapper[4772]: I0127 17:16:13.064011 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246"} Jan 27 17:16:13 crc kubenswrapper[4772]: I0127 17:16:13.064054 4772 scope.go:117] "RemoveContainer" containerID="ea5d30742e90c5fe244c125256ab670ebb26e9206885008781626f7b363771b3" Jan 27 17:16:13 crc 
kubenswrapper[4772]: I0127 17:16:13.064729 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:16:13 crc kubenswrapper[4772]: E0127 17:16:13.065001 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:16:25 crc kubenswrapper[4772]: I0127 17:16:25.663672 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:16:25 crc kubenswrapper[4772]: E0127 17:16:25.664507 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:16:36 crc kubenswrapper[4772]: I0127 17:16:36.663201 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:16:36 crc kubenswrapper[4772]: E0127 17:16:36.664605 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 
27 17:16:51 crc kubenswrapper[4772]: I0127 17:16:51.664021 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:16:51 crc kubenswrapper[4772]: E0127 17:16:51.665752 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:17:03 crc kubenswrapper[4772]: I0127 17:17:03.664586 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:17:03 crc kubenswrapper[4772]: E0127 17:17:03.665736 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:17:14 crc kubenswrapper[4772]: I0127 17:17:14.673275 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:17:14 crc kubenswrapper[4772]: E0127 17:17:14.686014 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:17:28 crc kubenswrapper[4772]: I0127 17:17:28.664246 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:17:28 crc kubenswrapper[4772]: E0127 17:17:28.665460 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:17:43 crc kubenswrapper[4772]: I0127 17:17:43.663210 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:17:43 crc kubenswrapper[4772]: E0127 17:17:43.663964 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:17:57 crc kubenswrapper[4772]: I0127 17:17:57.662695 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:17:57 crc kubenswrapper[4772]: E0127 17:17:57.663444 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:18:09 crc kubenswrapper[4772]: I0127 17:18:09.703412 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:18:09 crc kubenswrapper[4772]: E0127 17:18:09.705790 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:18:20 crc kubenswrapper[4772]: I0127 17:18:20.664140 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:18:20 crc kubenswrapper[4772]: E0127 17:18:20.665381 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:18:33 crc kubenswrapper[4772]: I0127 17:18:33.663906 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:18:33 crc kubenswrapper[4772]: E0127 17:18:33.664822 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:18:44 crc kubenswrapper[4772]: I0127 17:18:44.676072 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:18:44 crc kubenswrapper[4772]: E0127 17:18:44.677465 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:18:58 crc kubenswrapper[4772]: I0127 17:18:58.664119 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:18:58 crc kubenswrapper[4772]: E0127 17:18:58.665361 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.738606 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sdbgg"] Jan 27 17:19:08 crc kubenswrapper[4772]: E0127 17:19:08.740951 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db9aea1-8a66-4447-9150-812172da2a26" containerName="collect-profiles" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 
17:19:08.741058 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db9aea1-8a66-4447-9150-812172da2a26" containerName="collect-profiles" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.741407 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db9aea1-8a66-4447-9150-812172da2a26" containerName="collect-profiles" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.743673 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.750752 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdbgg"] Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.758733 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-utilities\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.758798 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpgqr\" (UniqueName: \"kubernetes.io/projected/09211095-3894-4db1-bcea-29d1c2064979-kube-api-access-bpgqr\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.758945 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-catalog-content\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 
17:19:08.860997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-utilities\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.861052 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgqr\" (UniqueName: \"kubernetes.io/projected/09211095-3894-4db1-bcea-29d1c2064979-kube-api-access-bpgqr\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.861129 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-catalog-content\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.861641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-utilities\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.861688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09211095-3894-4db1-bcea-29d1c2064979-catalog-content\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:08 crc kubenswrapper[4772]: I0127 17:19:08.882526 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bpgqr\" (UniqueName: \"kubernetes.io/projected/09211095-3894-4db1-bcea-29d1c2064979-kube-api-access-bpgqr\") pod \"redhat-operators-sdbgg\" (UID: \"09211095-3894-4db1-bcea-29d1c2064979\") " pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:09 crc kubenswrapper[4772]: I0127 17:19:09.081084 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:09 crc kubenswrapper[4772]: I0127 17:19:09.559038 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdbgg"] Jan 27 17:19:09 crc kubenswrapper[4772]: I0127 17:19:09.993490 4772 generic.go:334] "Generic (PLEG): container finished" podID="09211095-3894-4db1-bcea-29d1c2064979" containerID="c7d1aeb8760ccc3859f31c865946184e19dcfb95f1b5a51ede194d6b5159fd54" exitCode=0 Jan 27 17:19:09 crc kubenswrapper[4772]: I0127 17:19:09.993587 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdbgg" event={"ID":"09211095-3894-4db1-bcea-29d1c2064979","Type":"ContainerDied","Data":"c7d1aeb8760ccc3859f31c865946184e19dcfb95f1b5a51ede194d6b5159fd54"} Jan 27 17:19:09 crc kubenswrapper[4772]: I0127 17:19:09.993904 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdbgg" event={"ID":"09211095-3894-4db1-bcea-29d1c2064979","Type":"ContainerStarted","Data":"7b9104e2450ae40c8ccedbf7fb5718b06fc3fd6643498b6b6b5fcd64d1d1eed1"} Jan 27 17:19:10 crc kubenswrapper[4772]: I0127 17:19:10.664894 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:19:10 crc kubenswrapper[4772]: E0127 17:19:10.665979 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:19:19 crc kubenswrapper[4772]: I0127 17:19:19.079048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdbgg" event={"ID":"09211095-3894-4db1-bcea-29d1c2064979","Type":"ContainerStarted","Data":"03288c1630ff3a4aaf5cd2be5b75940caff281cc957f7c9a00544bea93505a5f"} Jan 27 17:19:21 crc kubenswrapper[4772]: I0127 17:19:21.099443 4772 generic.go:334] "Generic (PLEG): container finished" podID="09211095-3894-4db1-bcea-29d1c2064979" containerID="03288c1630ff3a4aaf5cd2be5b75940caff281cc957f7c9a00544bea93505a5f" exitCode=0 Jan 27 17:19:21 crc kubenswrapper[4772]: I0127 17:19:21.099502 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdbgg" event={"ID":"09211095-3894-4db1-bcea-29d1c2064979","Type":"ContainerDied","Data":"03288c1630ff3a4aaf5cd2be5b75940caff281cc957f7c9a00544bea93505a5f"} Jan 27 17:19:21 crc kubenswrapper[4772]: I0127 17:19:21.102691 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:19:22 crc kubenswrapper[4772]: I0127 17:19:22.110035 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sdbgg" event={"ID":"09211095-3894-4db1-bcea-29d1c2064979","Type":"ContainerStarted","Data":"9a0674be9539d95a9bb4caa1bb6e380e96fd25b87da2cf087c1d397de11bd988"} Jan 27 17:19:22 crc kubenswrapper[4772]: I0127 17:19:22.132060 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sdbgg" podStartSLOduration=2.382977571 podStartE2EDuration="14.132037977s" podCreationTimestamp="2026-01-27 17:19:08 +0000 UTC" firstStartedPulling="2026-01-27 17:19:09.994950226 +0000 UTC m=+7935.975559324" 
lastFinishedPulling="2026-01-27 17:19:21.744010632 +0000 UTC m=+7947.724619730" observedRunningTime="2026-01-27 17:19:22.127105597 +0000 UTC m=+7948.107714685" watchObservedRunningTime="2026-01-27 17:19:22.132037977 +0000 UTC m=+7948.112647065" Jan 27 17:19:24 crc kubenswrapper[4772]: I0127 17:19:24.675925 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:19:24 crc kubenswrapper[4772]: E0127 17:19:24.676545 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.081626 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.082135 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.128403 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.223576 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sdbgg" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.286838 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sdbgg"] Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.364216 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.364491 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5whzm" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="registry-server" containerID="cri-o://a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" gracePeriod=2 Jan 27 17:19:29 crc kubenswrapper[4772]: E0127 17:19:29.758674 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c is running failed: container process not found" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 17:19:29 crc kubenswrapper[4772]: E0127 17:19:29.759495 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c is running failed: container process not found" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 17:19:29 crc kubenswrapper[4772]: E0127 17:19:29.759861 4772 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c is running failed: container process not found" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 17:19:29 crc kubenswrapper[4772]: E0127 17:19:29.759933 4772 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-5whzm" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="registry-server" Jan 27 17:19:29 crc kubenswrapper[4772]: I0127 17:19:29.813820 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.014558 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities\") pod \"79b85747-dcbc-462d-85d1-3d00801b5106\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.014638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content\") pod \"79b85747-dcbc-462d-85d1-3d00801b5106\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.014685 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr9pf\" (UniqueName: \"kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf\") pod \"79b85747-dcbc-462d-85d1-3d00801b5106\" (UID: \"79b85747-dcbc-462d-85d1-3d00801b5106\") " Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.015093 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities" (OuterVolumeSpecName: "utilities") pod "79b85747-dcbc-462d-85d1-3d00801b5106" (UID: "79b85747-dcbc-462d-85d1-3d00801b5106"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.015238 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.021221 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf" (OuterVolumeSpecName: "kube-api-access-vr9pf") pod "79b85747-dcbc-462d-85d1-3d00801b5106" (UID: "79b85747-dcbc-462d-85d1-3d00801b5106"). InnerVolumeSpecName "kube-api-access-vr9pf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.116505 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr9pf\" (UniqueName: \"kubernetes.io/projected/79b85747-dcbc-462d-85d1-3d00801b5106-kube-api-access-vr9pf\") on node \"crc\" DevicePath \"\"" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.133854 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79b85747-dcbc-462d-85d1-3d00801b5106" (UID: "79b85747-dcbc-462d-85d1-3d00801b5106"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.183027 4772 generic.go:334] "Generic (PLEG): container finished" podID="79b85747-dcbc-462d-85d1-3d00801b5106" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" exitCode=0 Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.183070 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerDied","Data":"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c"} Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.183130 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5whzm" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.183139 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5whzm" event={"ID":"79b85747-dcbc-462d-85d1-3d00801b5106","Type":"ContainerDied","Data":"8e4bc519d0ced9952d6857ff31015675ce4705ed12ce2547ebdba49c33fc4d62"} Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.183161 4772 scope.go:117] "RemoveContainer" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.212268 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.212281 4772 scope.go:117] "RemoveContainer" containerID="eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.218404 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79b85747-dcbc-462d-85d1-3d00801b5106-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 
17:19:30.219239 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5whzm"] Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.235316 4772 scope.go:117] "RemoveContainer" containerID="d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.279994 4772 scope.go:117] "RemoveContainer" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" Jan 27 17:19:30 crc kubenswrapper[4772]: E0127 17:19:30.280471 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c\": container with ID starting with a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c not found: ID does not exist" containerID="a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.280514 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c"} err="failed to get container status \"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c\": rpc error: code = NotFound desc = could not find container \"a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c\": container with ID starting with a2d56215a8255cd6fe0e2caab61dc965898bd5a40e9690f3cc4fda7a7884bd0c not found: ID does not exist" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.280543 4772 scope.go:117] "RemoveContainer" containerID="eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7" Jan 27 17:19:30 crc kubenswrapper[4772]: E0127 17:19:30.281006 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7\": container with ID 
starting with eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7 not found: ID does not exist" containerID="eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.281047 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7"} err="failed to get container status \"eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7\": rpc error: code = NotFound desc = could not find container \"eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7\": container with ID starting with eb48a00f974d6ceae0f8e67b809e0845663f47278e4609e58f35d13935712ff7 not found: ID does not exist" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.281074 4772 scope.go:117] "RemoveContainer" containerID="d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1" Jan 27 17:19:30 crc kubenswrapper[4772]: E0127 17:19:30.281378 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1\": container with ID starting with d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1 not found: ID does not exist" containerID="d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.281402 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1"} err="failed to get container status \"d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1\": rpc error: code = NotFound desc = could not find container \"d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1\": container with ID starting with d2c52e57f45d6596ec80d58984736144fe9463319975ceee95db9f926e0b38f1 not found: 
ID does not exist" Jan 27 17:19:30 crc kubenswrapper[4772]: I0127 17:19:30.674984 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" path="/var/lib/kubelet/pods/79b85747-dcbc-462d-85d1-3d00801b5106/volumes" Jan 27 17:19:39 crc kubenswrapper[4772]: I0127 17:19:39.663570 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:19:39 crc kubenswrapper[4772]: E0127 17:19:39.664897 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:19:52 crc kubenswrapper[4772]: I0127 17:19:52.672070 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:19:52 crc kubenswrapper[4772]: E0127 17:19:52.672987 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:20:03 crc kubenswrapper[4772]: I0127 17:20:03.663711 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:20:03 crc kubenswrapper[4772]: E0127 17:20:03.664857 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:20:16 crc kubenswrapper[4772]: I0127 17:20:16.664057 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:20:16 crc kubenswrapper[4772]: E0127 17:20:16.667039 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:20:31 crc kubenswrapper[4772]: I0127 17:20:31.663567 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:20:31 crc kubenswrapper[4772]: E0127 17:20:31.664787 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:20:43 crc kubenswrapper[4772]: I0127 17:20:43.663023 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:20:43 crc kubenswrapper[4772]: E0127 17:20:43.663895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:20:57 crc kubenswrapper[4772]: I0127 17:20:57.663581 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:20:57 crc kubenswrapper[4772]: E0127 17:20:57.664478 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:21:11 crc kubenswrapper[4772]: I0127 17:21:11.662557 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:21:11 crc kubenswrapper[4772]: E0127 17:21:11.663218 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:21:26 crc kubenswrapper[4772]: I0127 17:21:26.663724 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:21:27 crc kubenswrapper[4772]: I0127 17:21:27.279926 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f"} Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.634820 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:33 crc kubenswrapper[4772]: E0127 17:22:33.635862 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="extract-utilities" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.635881 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="extract-utilities" Jan 27 17:22:33 crc kubenswrapper[4772]: E0127 17:22:33.635911 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="extract-content" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.635920 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="extract-content" Jan 27 17:22:33 crc kubenswrapper[4772]: E0127 17:22:33.635934 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="registry-server" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.635943 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="registry-server" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.636157 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b85747-dcbc-462d-85d1-3d00801b5106" containerName="registry-server" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.637836 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.646380 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.728411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.728504 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fcfk\" (UniqueName: \"kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.728594 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.831006 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.831087 4772 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-9fcfk\" (UniqueName: \"kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.831146 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.831688 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.832288 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.852870 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fcfk\" (UniqueName: \"kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk\") pod \"redhat-marketplace-4xvdq\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:33 crc kubenswrapper[4772]: I0127 17:22:33.954320 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:34 crc kubenswrapper[4772]: I0127 17:22:34.489937 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:34 crc kubenswrapper[4772]: I0127 17:22:34.971079 4772 generic.go:334] "Generic (PLEG): container finished" podID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerID="08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6" exitCode=0 Jan 27 17:22:34 crc kubenswrapper[4772]: I0127 17:22:34.971124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerDied","Data":"08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6"} Jan 27 17:22:34 crc kubenswrapper[4772]: I0127 17:22:34.971154 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerStarted","Data":"aee6dea538b507f3e6d4df03a2d19ed1bfe49288afa763d30e38e9301cce0362"} Jan 27 17:22:35 crc kubenswrapper[4772]: I0127 17:22:35.982715 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerStarted","Data":"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f"} Jan 27 17:22:36 crc kubenswrapper[4772]: I0127 17:22:36.995907 4772 generic.go:334] "Generic (PLEG): container finished" podID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerID="e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f" exitCode=0 Jan 27 17:22:36 crc kubenswrapper[4772]: I0127 17:22:36.996161 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" 
event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerDied","Data":"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f"} Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.829344 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.831620 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.843365 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.918998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.919115 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsv8q\" (UniqueName: \"kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:37 crc kubenswrapper[4772]: I0127 17:22:37.919151 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.020197 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.020237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerStarted","Data":"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2"} Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.020600 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsv8q\" (UniqueName: \"kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.020633 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.020671 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.021564 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.041936 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsv8q\" (UniqueName: \"kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q\") pod \"community-operators-c894q\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.048279 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xvdq" podStartSLOduration=2.5936443909999998 podStartE2EDuration="5.048262827s" podCreationTimestamp="2026-01-27 17:22:33 +0000 UTC" firstStartedPulling="2026-01-27 17:22:34.973435452 +0000 UTC m=+8140.954044560" lastFinishedPulling="2026-01-27 17:22:37.428053858 +0000 UTC m=+8143.408662996" observedRunningTime="2026-01-27 17:22:38.039234861 +0000 UTC m=+8144.019843979" watchObservedRunningTime="2026-01-27 17:22:38.048262827 +0000 UTC m=+8144.028871915" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.158239 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:38 crc kubenswrapper[4772]: I0127 17:22:38.726001 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:38 crc kubenswrapper[4772]: W0127 17:22:38.726467 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79767c4a_3d92_40a5_8128_a9b7785d4672.slice/crio-64cc0b646724882690474db4430de2f2239d9f2a6e5ce0a0dcddb5c0fc527179 WatchSource:0}: Error finding container 64cc0b646724882690474db4430de2f2239d9f2a6e5ce0a0dcddb5c0fc527179: Status 404 returned error can't find the container with id 64cc0b646724882690474db4430de2f2239d9f2a6e5ce0a0dcddb5c0fc527179 Jan 27 17:22:39 crc kubenswrapper[4772]: I0127 17:22:39.031837 4772 generic.go:334] "Generic (PLEG): container finished" podID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerID="b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0" exitCode=0 Jan 27 17:22:39 crc kubenswrapper[4772]: I0127 17:22:39.031965 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerDied","Data":"b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0"} Jan 27 17:22:39 crc kubenswrapper[4772]: I0127 17:22:39.032212 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerStarted","Data":"64cc0b646724882690474db4430de2f2239d9f2a6e5ce0a0dcddb5c0fc527179"} Jan 27 17:22:41 crc kubenswrapper[4772]: I0127 17:22:41.053926 4772 generic.go:334] "Generic (PLEG): container finished" podID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerID="931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad" exitCode=0 Jan 27 17:22:41 crc kubenswrapper[4772]: I0127 
17:22:41.054012 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerDied","Data":"931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad"} Jan 27 17:22:42 crc kubenswrapper[4772]: I0127 17:22:42.066571 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerStarted","Data":"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a"} Jan 27 17:22:42 crc kubenswrapper[4772]: I0127 17:22:42.099097 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c894q" podStartSLOduration=2.420509466 podStartE2EDuration="5.099072773s" podCreationTimestamp="2026-01-27 17:22:37 +0000 UTC" firstStartedPulling="2026-01-27 17:22:39.034523379 +0000 UTC m=+8145.015132487" lastFinishedPulling="2026-01-27 17:22:41.713086656 +0000 UTC m=+8147.693695794" observedRunningTime="2026-01-27 17:22:42.096549011 +0000 UTC m=+8148.077158139" watchObservedRunningTime="2026-01-27 17:22:42.099072773 +0000 UTC m=+8148.079681881" Jan 27 17:22:43 crc kubenswrapper[4772]: I0127 17:22:43.956423 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:43 crc kubenswrapper[4772]: I0127 17:22:43.958103 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:44 crc kubenswrapper[4772]: I0127 17:22:44.040757 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:44 crc kubenswrapper[4772]: I0127 17:22:44.136771 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 
17:22:45 crc kubenswrapper[4772]: I0127 17:22:45.425679 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.102070 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xvdq" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="registry-server" containerID="cri-o://313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2" gracePeriod=2 Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.595438 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.733506 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities\") pod \"6b2a1877-960b-4e44-8e6d-47744d3e764b\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.733590 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content\") pod \"6b2a1877-960b-4e44-8e6d-47744d3e764b\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.733751 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fcfk\" (UniqueName: \"kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk\") pod \"6b2a1877-960b-4e44-8e6d-47744d3e764b\" (UID: \"6b2a1877-960b-4e44-8e6d-47744d3e764b\") " Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.734884 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities" (OuterVolumeSpecName: "utilities") pod "6b2a1877-960b-4e44-8e6d-47744d3e764b" (UID: "6b2a1877-960b-4e44-8e6d-47744d3e764b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.742916 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk" (OuterVolumeSpecName: "kube-api-access-9fcfk") pod "6b2a1877-960b-4e44-8e6d-47744d3e764b" (UID: "6b2a1877-960b-4e44-8e6d-47744d3e764b"). InnerVolumeSpecName "kube-api-access-9fcfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.782964 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b2a1877-960b-4e44-8e6d-47744d3e764b" (UID: "6b2a1877-960b-4e44-8e6d-47744d3e764b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.839898 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.840139 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b2a1877-960b-4e44-8e6d-47744d3e764b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:46 crc kubenswrapper[4772]: I0127 17:22:46.840323 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fcfk\" (UniqueName: \"kubernetes.io/projected/6b2a1877-960b-4e44-8e6d-47744d3e764b-kube-api-access-9fcfk\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.117406 4772 generic.go:334] "Generic (PLEG): container finished" podID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerID="313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2" exitCode=0 Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.117474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerDied","Data":"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2"} Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.117524 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xvdq" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.117557 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xvdq" event={"ID":"6b2a1877-960b-4e44-8e6d-47744d3e764b","Type":"ContainerDied","Data":"aee6dea538b507f3e6d4df03a2d19ed1bfe49288afa763d30e38e9301cce0362"} Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.117589 4772 scope.go:117] "RemoveContainer" containerID="313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.146906 4772 scope.go:117] "RemoveContainer" containerID="e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.173879 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.190749 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xvdq"] Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.198553 4772 scope.go:117] "RemoveContainer" containerID="08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.241630 4772 scope.go:117] "RemoveContainer" containerID="313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2" Jan 27 17:22:47 crc kubenswrapper[4772]: E0127 17:22:47.242380 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2\": container with ID starting with 313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2 not found: ID does not exist" containerID="313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.242451 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2"} err="failed to get container status \"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2\": rpc error: code = NotFound desc = could not find container \"313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2\": container with ID starting with 313eb3ad4fc2fcb56fe6d4828f016dd448a0cc926ce0e656cd2cc46796002ab2 not found: ID does not exist" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.242479 4772 scope.go:117] "RemoveContainer" containerID="e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f" Jan 27 17:22:47 crc kubenswrapper[4772]: E0127 17:22:47.242999 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f\": container with ID starting with e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f not found: ID does not exist" containerID="e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.243054 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f"} err="failed to get container status \"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f\": rpc error: code = NotFound desc = could not find container \"e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f\": container with ID starting with e682f9deaaadeb63ad6e4ae31c623a39a070781b1ba79a28bca994caba62c19f not found: ID does not exist" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.243087 4772 scope.go:117] "RemoveContainer" containerID="08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6" Jan 27 17:22:47 crc kubenswrapper[4772]: E0127 
17:22:47.243529 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6\": container with ID starting with 08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6 not found: ID does not exist" containerID="08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6" Jan 27 17:22:47 crc kubenswrapper[4772]: I0127 17:22:47.243564 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6"} err="failed to get container status \"08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6\": rpc error: code = NotFound desc = could not find container \"08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6\": container with ID starting with 08f7f6255d2bb42989667d0a5e1713a33ba4ba31840b81d1925d49697b9af8c6 not found: ID does not exist" Jan 27 17:22:48 crc kubenswrapper[4772]: I0127 17:22:48.158762 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:48 crc kubenswrapper[4772]: I0127 17:22:48.158881 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:48 crc kubenswrapper[4772]: I0127 17:22:48.206147 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:48 crc kubenswrapper[4772]: I0127 17:22:48.674597 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" path="/var/lib/kubelet/pods/6b2a1877-960b-4e44-8e6d-47744d3e764b/volumes" Jan 27 17:22:49 crc kubenswrapper[4772]: I0127 17:22:49.184847 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:49 crc kubenswrapper[4772]: I0127 17:22:49.429355 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.164669 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c894q" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="registry-server" containerID="cri-o://227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a" gracePeriod=2 Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.712243 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.753892 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsv8q\" (UniqueName: \"kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q\") pod \"79767c4a-3d92-40a5-8128-a9b7785d4672\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.754002 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content\") pod \"79767c4a-3d92-40a5-8128-a9b7785d4672\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.754089 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities\") pod \"79767c4a-3d92-40a5-8128-a9b7785d4672\" (UID: \"79767c4a-3d92-40a5-8128-a9b7785d4672\") " Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.755110 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities" (OuterVolumeSpecName: "utilities") pod "79767c4a-3d92-40a5-8128-a9b7785d4672" (UID: "79767c4a-3d92-40a5-8128-a9b7785d4672"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.761250 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q" (OuterVolumeSpecName: "kube-api-access-vsv8q") pod "79767c4a-3d92-40a5-8128-a9b7785d4672" (UID: "79767c4a-3d92-40a5-8128-a9b7785d4672"). InnerVolumeSpecName "kube-api-access-vsv8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.811524 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79767c4a-3d92-40a5-8128-a9b7785d4672" (UID: "79767c4a-3d92-40a5-8128-a9b7785d4672"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.855478 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.855506 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79767c4a-3d92-40a5-8128-a9b7785d4672-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:51 crc kubenswrapper[4772]: I0127 17:22:51.855515 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsv8q\" (UniqueName: \"kubernetes.io/projected/79767c4a-3d92-40a5-8128-a9b7785d4672-kube-api-access-vsv8q\") on node \"crc\" DevicePath \"\"" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.216277 4772 generic.go:334] "Generic (PLEG): container finished" podID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerID="227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a" exitCode=0 Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.216339 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerDied","Data":"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a"} Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.216381 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c894q" event={"ID":"79767c4a-3d92-40a5-8128-a9b7785d4672","Type":"ContainerDied","Data":"64cc0b646724882690474db4430de2f2239d9f2a6e5ce0a0dcddb5c0fc527179"} Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.216412 4772 scope.go:117] "RemoveContainer" containerID="227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 
17:22:52.216380 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c894q" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.265683 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.275981 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c894q"] Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.485355 4772 scope.go:117] "RemoveContainer" containerID="931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.526225 4772 scope.go:117] "RemoveContainer" containerID="b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.568123 4772 scope.go:117] "RemoveContainer" containerID="227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a" Jan 27 17:22:52 crc kubenswrapper[4772]: E0127 17:22:52.569243 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a\": container with ID starting with 227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a not found: ID does not exist" containerID="227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.569348 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a"} err="failed to get container status \"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a\": rpc error: code = NotFound desc = could not find container \"227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a\": container with ID starting with 
227375896e50ca3ae2a98fd56581e6e897945772df9f8a7d35dcddb59e16d22a not found: ID does not exist" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.569408 4772 scope.go:117] "RemoveContainer" containerID="931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad" Jan 27 17:22:52 crc kubenswrapper[4772]: E0127 17:22:52.569886 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad\": container with ID starting with 931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad not found: ID does not exist" containerID="931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.569929 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad"} err="failed to get container status \"931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad\": rpc error: code = NotFound desc = could not find container \"931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad\": container with ID starting with 931a3ea833e76451e7ae9443d7919ff52b6d3c292558ce3eac47358e7f05fdad not found: ID does not exist" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.569955 4772 scope.go:117] "RemoveContainer" containerID="b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0" Jan 27 17:22:52 crc kubenswrapper[4772]: E0127 17:22:52.570375 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0\": container with ID starting with b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0 not found: ID does not exist" containerID="b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0" Jan 27 17:22:52 crc 
kubenswrapper[4772]: I0127 17:22:52.570411 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0"} err="failed to get container status \"b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0\": rpc error: code = NotFound desc = could not find container \"b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0\": container with ID starting with b7996be57a39eb18a216e3d76b272bae729bf867e09dc88e02c79f2bbf29e2c0 not found: ID does not exist" Jan 27 17:22:52 crc kubenswrapper[4772]: I0127 17:22:52.675589 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" path="/var/lib/kubelet/pods/79767c4a-3d92-40a5-8128-a9b7785d4672/volumes" Jan 27 17:23:42 crc kubenswrapper[4772]: I0127 17:23:42.058936 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:23:42 crc kubenswrapper[4772]: I0127 17:23:42.059571 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:24:12 crc kubenswrapper[4772]: I0127 17:24:12.058274 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:24:12 crc kubenswrapper[4772]: I0127 17:24:12.059160 4772 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.058880 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.059341 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.059383 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.060157 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.060292 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" 
containerName="machine-config-daemon" containerID="cri-o://a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f" gracePeriod=600 Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.446041 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f" exitCode=0 Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.446121 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f"} Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.446504 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284"} Jan 27 17:24:42 crc kubenswrapper[4772]: I0127 17:24:42.446531 4772 scope.go:117] "RemoveContainer" containerID="01301e7d30d90ee5d4e73d016fb4f9b1d80c5e9139db297ce836d144473d7246" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.746201 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747634 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="extract-content" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747660 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="extract-content" Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747696 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" 
containerName="extract-utilities" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747709 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="extract-utilities" Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747727 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747737 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747759 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="extract-utilities" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747768 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" containerName="extract-utilities" Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747791 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="extract-content" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747799 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="extract-content" Jan 27 17:25:19 crc kubenswrapper[4772]: E0127 17:25:19.747826 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.747865 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.748106 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="79767c4a-3d92-40a5-8128-a9b7785d4672" 
containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.748128 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b2a1877-960b-4e44-8e6d-47744d3e764b" containerName="registry-server" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.749781 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.776218 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.833133 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjq7m\" (UniqueName: \"kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.833403 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.833565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.935031 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.935114 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.935182 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjq7m\" (UniqueName: \"kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.935635 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.935737 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:19 crc kubenswrapper[4772]: I0127 17:25:19.970996 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjq7m\" (UniqueName: 
\"kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m\") pod \"certified-operators-vf8r7\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:20 crc kubenswrapper[4772]: I0127 17:25:20.107961 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:20 crc kubenswrapper[4772]: I0127 17:25:20.689655 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:20 crc kubenswrapper[4772]: W0127 17:25:20.704360 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d7e227_3c40_4173_abf4_7c1ee4aee5d2.slice/crio-a113e824463a2b79f76c2d98224ce0ade7752003b9acd3f5ff9e304b1ff68a8a WatchSource:0}: Error finding container a113e824463a2b79f76c2d98224ce0ade7752003b9acd3f5ff9e304b1ff68a8a: Status 404 returned error can't find the container with id a113e824463a2b79f76c2d98224ce0ade7752003b9acd3f5ff9e304b1ff68a8a Jan 27 17:25:20 crc kubenswrapper[4772]: I0127 17:25:20.880345 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerStarted","Data":"a113e824463a2b79f76c2d98224ce0ade7752003b9acd3f5ff9e304b1ff68a8a"} Jan 27 17:25:21 crc kubenswrapper[4772]: I0127 17:25:21.892553 4772 generic.go:334] "Generic (PLEG): container finished" podID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerID="aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20" exitCode=0 Jan 27 17:25:21 crc kubenswrapper[4772]: I0127 17:25:21.892644 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" 
event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerDied","Data":"aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20"} Jan 27 17:25:21 crc kubenswrapper[4772]: I0127 17:25:21.895303 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:25:22 crc kubenswrapper[4772]: I0127 17:25:22.904329 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerStarted","Data":"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757"} Jan 27 17:25:23 crc kubenswrapper[4772]: I0127 17:25:23.919259 4772 generic.go:334] "Generic (PLEG): container finished" podID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerID="5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757" exitCode=0 Jan 27 17:25:23 crc kubenswrapper[4772]: I0127 17:25:23.919357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerDied","Data":"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757"} Jan 27 17:25:24 crc kubenswrapper[4772]: I0127 17:25:24.929407 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerStarted","Data":"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1"} Jan 27 17:25:30 crc kubenswrapper[4772]: I0127 17:25:30.108794 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:30 crc kubenswrapper[4772]: I0127 17:25:30.109632 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:30 crc kubenswrapper[4772]: I0127 17:25:30.169259 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:30 crc kubenswrapper[4772]: I0127 17:25:30.193250 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vf8r7" podStartSLOduration=8.720730494 podStartE2EDuration="11.193234716s" podCreationTimestamp="2026-01-27 17:25:19 +0000 UTC" firstStartedPulling="2026-01-27 17:25:21.895058081 +0000 UTC m=+8307.875667179" lastFinishedPulling="2026-01-27 17:25:24.367562273 +0000 UTC m=+8310.348171401" observedRunningTime="2026-01-27 17:25:24.956940442 +0000 UTC m=+8310.937549580" watchObservedRunningTime="2026-01-27 17:25:30.193234716 +0000 UTC m=+8316.173843814" Jan 27 17:25:31 crc kubenswrapper[4772]: I0127 17:25:31.052283 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:31 crc kubenswrapper[4772]: I0127 17:25:31.104637 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.012474 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vf8r7" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="registry-server" containerID="cri-o://3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1" gracePeriod=2 Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.580406 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.617123 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content\") pod \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.617400 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities\") pod \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.617579 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjq7m\" (UniqueName: \"kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m\") pod \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\" (UID: \"46d7e227-3c40-4173-abf4-7c1ee4aee5d2\") " Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.618344 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities" (OuterVolumeSpecName: "utilities") pod "46d7e227-3c40-4173-abf4-7c1ee4aee5d2" (UID: "46d7e227-3c40-4173-abf4-7c1ee4aee5d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.623508 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m" (OuterVolumeSpecName: "kube-api-access-zjq7m") pod "46d7e227-3c40-4173-abf4-7c1ee4aee5d2" (UID: "46d7e227-3c40-4173-abf4-7c1ee4aee5d2"). InnerVolumeSpecName "kube-api-access-zjq7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.669113 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46d7e227-3c40-4173-abf4-7c1ee4aee5d2" (UID: "46d7e227-3c40-4173-abf4-7c1ee4aee5d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.719467 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.719501 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjq7m\" (UniqueName: \"kubernetes.io/projected/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-kube-api-access-zjq7m\") on node \"crc\" DevicePath \"\"" Jan 27 17:25:33 crc kubenswrapper[4772]: I0127 17:25:33.719512 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46d7e227-3c40-4173-abf4-7c1ee4aee5d2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.025126 4772 generic.go:334] "Generic (PLEG): container finished" podID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerID="3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1" exitCode=0 Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.025194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerDied","Data":"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1"} Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.025241 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-vf8r7" event={"ID":"46d7e227-3c40-4173-abf4-7c1ee4aee5d2","Type":"ContainerDied","Data":"a113e824463a2b79f76c2d98224ce0ade7752003b9acd3f5ff9e304b1ff68a8a"} Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.025262 4772 scope.go:117] "RemoveContainer" containerID="3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.025309 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vf8r7" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.046483 4772 scope.go:117] "RemoveContainer" containerID="5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.079458 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.091238 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vf8r7"] Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.098152 4772 scope.go:117] "RemoveContainer" containerID="aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.158550 4772 scope.go:117] "RemoveContainer" containerID="3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1" Jan 27 17:25:34 crc kubenswrapper[4772]: E0127 17:25:34.159143 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1\": container with ID starting with 3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1 not found: ID does not exist" containerID="3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 
17:25:34.159219 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1"} err="failed to get container status \"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1\": rpc error: code = NotFound desc = could not find container \"3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1\": container with ID starting with 3b09782714157f33427572d6bbbe2fb46604581cbefffd50d34df0f457248ad1 not found: ID does not exist" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.159253 4772 scope.go:117] "RemoveContainer" containerID="5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757" Jan 27 17:25:34 crc kubenswrapper[4772]: E0127 17:25:34.159766 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757\": container with ID starting with 5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757 not found: ID does not exist" containerID="5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.159831 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757"} err="failed to get container status \"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757\": rpc error: code = NotFound desc = could not find container \"5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757\": container with ID starting with 5f5443e55691e58714f2473c9100f2320f98b4ed0e3962a5ebdf967c7fe05757 not found: ID does not exist" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.159873 4772 scope.go:117] "RemoveContainer" containerID="aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20" Jan 27 17:25:34 crc 
kubenswrapper[4772]: E0127 17:25:34.160609 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20\": container with ID starting with aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20 not found: ID does not exist" containerID="aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.160651 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20"} err="failed to get container status \"aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20\": rpc error: code = NotFound desc = could not find container \"aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20\": container with ID starting with aad0b49b76f8d36172eb452dff4c5d1765e0da461be5445c32ece6a3405d8b20 not found: ID does not exist" Jan 27 17:25:34 crc kubenswrapper[4772]: I0127 17:25:34.681985 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" path="/var/lib/kubelet/pods/46d7e227-3c40-4173-abf4-7c1ee4aee5d2/volumes" Jan 27 17:26:42 crc kubenswrapper[4772]: I0127 17:26:42.059356 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:26:42 crc kubenswrapper[4772]: I0127 17:26:42.060369 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Jan 27 17:27:12 crc kubenswrapper[4772]: I0127 17:27:12.058499 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:27:12 crc kubenswrapper[4772]: I0127 17:27:12.059152 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.058788 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.059656 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.059740 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.061101 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.061255 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" gracePeriod=600 Jan 27 17:27:42 crc kubenswrapper[4772]: E0127 17:27:42.198427 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.415531 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" exitCode=0 Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.415581 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284"} Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.415937 4772 scope.go:117] "RemoveContainer" containerID="a0596c152e2f2b0802267f33c0f2d3224ef4bcfda5d22940bb1eb3256403bf5f" Jan 27 17:27:42 crc kubenswrapper[4772]: I0127 17:27:42.416522 4772 
scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:27:42 crc kubenswrapper[4772]: E0127 17:27:42.416845 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:27:57 crc kubenswrapper[4772]: I0127 17:27:57.663811 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:27:57 crc kubenswrapper[4772]: E0127 17:27:57.665128 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:28:10 crc kubenswrapper[4772]: I0127 17:28:10.663501 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:28:10 crc kubenswrapper[4772]: E0127 17:28:10.664850 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:28:25 crc kubenswrapper[4772]: I0127 
17:28:25.663626 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:28:25 crc kubenswrapper[4772]: E0127 17:28:25.665033 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:28:40 crc kubenswrapper[4772]: I0127 17:28:40.663877 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:28:40 crc kubenswrapper[4772]: E0127 17:28:40.665254 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:28:53 crc kubenswrapper[4772]: I0127 17:28:53.664715 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:28:53 crc kubenswrapper[4772]: E0127 17:28:53.665869 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:29:04 crc 
kubenswrapper[4772]: I0127 17:29:04.675951 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:29:04 crc kubenswrapper[4772]: E0127 17:29:04.676858 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:29:18 crc kubenswrapper[4772]: I0127 17:29:18.664283 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:29:18 crc kubenswrapper[4772]: E0127 17:29:18.665623 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.040550 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:29 crc kubenswrapper[4772]: E0127 17:29:29.041754 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="extract-content" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.041774 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="extract-content" Jan 27 17:29:29 crc kubenswrapper[4772]: E0127 17:29:29.041828 4772 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="registry-server" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.041841 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="registry-server" Jan 27 17:29:29 crc kubenswrapper[4772]: E0127 17:29:29.041861 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="extract-utilities" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.041873 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="extract-utilities" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.042142 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d7e227-3c40-4173-abf4-7c1ee4aee5d2" containerName="registry-server" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.044435 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.059476 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.187369 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.187440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45llf\" (UniqueName: \"kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.187460 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.289834 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.289930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-45llf\" (UniqueName: \"kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.289956 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.290467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.290731 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.323718 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45llf\" (UniqueName: \"kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf\") pod \"redhat-operators-r5pg8\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.373765 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:29 crc kubenswrapper[4772]: I0127 17:29:29.835496 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:30 crc kubenswrapper[4772]: I0127 17:29:30.611830 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerID="726824e909b709ea955048da1b47107c917a9ead2621a9e65cc30363053366cb" exitCode=0 Jan 27 17:29:30 crc kubenswrapper[4772]: I0127 17:29:30.611937 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerDied","Data":"726824e909b709ea955048da1b47107c917a9ead2621a9e65cc30363053366cb"} Jan 27 17:29:30 crc kubenswrapper[4772]: I0127 17:29:30.612144 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerStarted","Data":"4880d61008d58f60627606eb86b9a7ff700103c4a3a917606afb4f4e5637a700"} Jan 27 17:29:31 crc kubenswrapper[4772]: I0127 17:29:31.631004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerStarted","Data":"be4fbbd1741965f3fcecf5b9a29d8a1a7d7296f88ab213fffd113439cfaa68b6"} Jan 27 17:29:32 crc kubenswrapper[4772]: I0127 17:29:32.663259 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:29:32 crc kubenswrapper[4772]: E0127 17:29:32.664042 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:29:33 crc kubenswrapper[4772]: I0127 17:29:33.659371 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerID="be4fbbd1741965f3fcecf5b9a29d8a1a7d7296f88ab213fffd113439cfaa68b6" exitCode=0 Jan 27 17:29:33 crc kubenswrapper[4772]: I0127 17:29:33.659468 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerDied","Data":"be4fbbd1741965f3fcecf5b9a29d8a1a7d7296f88ab213fffd113439cfaa68b6"} Jan 27 17:29:35 crc kubenswrapper[4772]: I0127 17:29:35.683556 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerStarted","Data":"024773fe5b0145280bf522bfd231862b43616d6d3e7c272e497743d5b82823f5"} Jan 27 17:29:35 crc kubenswrapper[4772]: I0127 17:29:35.726123 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r5pg8" podStartSLOduration=4.134854132 podStartE2EDuration="7.726093374s" podCreationTimestamp="2026-01-27 17:29:28 +0000 UTC" firstStartedPulling="2026-01-27 17:29:30.614228988 +0000 UTC m=+8556.594838086" lastFinishedPulling="2026-01-27 17:29:34.20546819 +0000 UTC m=+8560.186077328" observedRunningTime="2026-01-27 17:29:35.713882944 +0000 UTC m=+8561.694492072" watchObservedRunningTime="2026-01-27 17:29:35.726093374 +0000 UTC m=+8561.706702512" Jan 27 17:29:39 crc kubenswrapper[4772]: I0127 17:29:39.374513 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:39 crc kubenswrapper[4772]: I0127 17:29:39.374985 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:40 crc kubenswrapper[4772]: I0127 17:29:40.427517 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r5pg8" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="registry-server" probeResult="failure" output=< Jan 27 17:29:40 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:29:40 crc kubenswrapper[4772]: > Jan 27 17:29:45 crc kubenswrapper[4772]: I0127 17:29:45.664228 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:29:45 crc kubenswrapper[4772]: E0127 17:29:45.665588 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:29:49 crc kubenswrapper[4772]: I0127 17:29:49.461186 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:49 crc kubenswrapper[4772]: I0127 17:29:49.547888 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:49 crc kubenswrapper[4772]: I0127 17:29:49.707600 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:50 crc kubenswrapper[4772]: I0127 17:29:50.840796 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r5pg8" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" 
containerName="registry-server" containerID="cri-o://024773fe5b0145280bf522bfd231862b43616d6d3e7c272e497743d5b82823f5" gracePeriod=2 Jan 27 17:29:51 crc kubenswrapper[4772]: I0127 17:29:51.852970 4772 generic.go:334] "Generic (PLEG): container finished" podID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerID="024773fe5b0145280bf522bfd231862b43616d6d3e7c272e497743d5b82823f5" exitCode=0 Jan 27 17:29:51 crc kubenswrapper[4772]: I0127 17:29:51.853024 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerDied","Data":"024773fe5b0145280bf522bfd231862b43616d6d3e7c272e497743d5b82823f5"} Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.286235 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.341794 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content\") pod \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.341854 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities\") pod \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.341929 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45llf\" (UniqueName: \"kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf\") pod \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\" (UID: \"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5\") " Jan 27 17:29:52 crc 
kubenswrapper[4772]: I0127 17:29:52.343251 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities" (OuterVolumeSpecName: "utilities") pod "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" (UID: "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.352546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf" (OuterVolumeSpecName: "kube-api-access-45llf") pod "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" (UID: "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5"). InnerVolumeSpecName "kube-api-access-45llf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.444601 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.444646 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45llf\" (UniqueName: \"kubernetes.io/projected/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-kube-api-access-45llf\") on node \"crc\" DevicePath \"\"" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.461109 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" (UID: "ff1acde3-5e9f-4659-a5b5-88497d0ce4c5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.546650 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.869566 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5pg8" event={"ID":"ff1acde3-5e9f-4659-a5b5-88497d0ce4c5","Type":"ContainerDied","Data":"4880d61008d58f60627606eb86b9a7ff700103c4a3a917606afb4f4e5637a700"} Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.870416 4772 scope.go:117] "RemoveContainer" containerID="024773fe5b0145280bf522bfd231862b43616d6d3e7c272e497743d5b82823f5" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.869677 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5pg8" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.927254 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.938094 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r5pg8"] Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.943873 4772 scope.go:117] "RemoveContainer" containerID="be4fbbd1741965f3fcecf5b9a29d8a1a7d7296f88ab213fffd113439cfaa68b6" Jan 27 17:29:52 crc kubenswrapper[4772]: I0127 17:29:52.978577 4772 scope.go:117] "RemoveContainer" containerID="726824e909b709ea955048da1b47107c917a9ead2621a9e65cc30363053366cb" Jan 27 17:29:54 crc kubenswrapper[4772]: I0127 17:29:54.680538 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" path="/var/lib/kubelet/pods/ff1acde3-5e9f-4659-a5b5-88497d0ce4c5/volumes" Jan 27 17:29:59 crc 
kubenswrapper[4772]: I0127 17:29:59.664074 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:29:59 crc kubenswrapper[4772]: E0127 17:29:59.664838 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.156334 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54"] Jan 27 17:30:00 crc kubenswrapper[4772]: E0127 17:30:00.157507 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="extract-utilities" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.157551 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="extract-utilities" Jan 27 17:30:00 crc kubenswrapper[4772]: E0127 17:30:00.157573 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="extract-content" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.157589 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="extract-content" Jan 27 17:30:00 crc kubenswrapper[4772]: E0127 17:30:00.157611 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="registry-server" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.157624 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="registry-server" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.158028 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1acde3-5e9f-4659-a5b5-88497d0ce4c5" containerName="registry-server" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.159084 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.161075 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.161450 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.190659 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54"] Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.318971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4v8d\" (UniqueName: \"kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.319239 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 
17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.319433 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.422152 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4v8d\" (UniqueName: \"kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.422321 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.422394 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.424007 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: 
\"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.885069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:00 crc kubenswrapper[4772]: I0127 17:30:00.886289 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4v8d\" (UniqueName: \"kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d\") pod \"collect-profiles-29492250-lgh54\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:01 crc kubenswrapper[4772]: I0127 17:30:01.095578 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:01 crc kubenswrapper[4772]: I0127 17:30:01.611767 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54"] Jan 27 17:30:01 crc kubenswrapper[4772]: W0127 17:30:01.627003 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02097b9_7eb9_4e2c_ad5e_5ec0983cb874.slice/crio-e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e WatchSource:0}: Error finding container e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e: Status 404 returned error can't find the container with id e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e Jan 27 17:30:01 crc kubenswrapper[4772]: I0127 17:30:01.952719 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" event={"ID":"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874","Type":"ContainerStarted","Data":"2c0ec74fa28e0e4c9ec0bad9a7085dcc6ba408566f72f39040535b3c2d917984"} Jan 27 17:30:01 crc kubenswrapper[4772]: I0127 17:30:01.952765 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" event={"ID":"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874","Type":"ContainerStarted","Data":"e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e"} Jan 27 17:30:01 crc kubenswrapper[4772]: I0127 17:30:01.974314 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" podStartSLOduration=1.974299113 podStartE2EDuration="1.974299113s" podCreationTimestamp="2026-01-27 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
17:30:01.974280782 +0000 UTC m=+8587.954889900" watchObservedRunningTime="2026-01-27 17:30:01.974299113 +0000 UTC m=+8587.954908211" Jan 27 17:30:02 crc kubenswrapper[4772]: I0127 17:30:02.970647 4772 generic.go:334] "Generic (PLEG): container finished" podID="c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" containerID="2c0ec74fa28e0e4c9ec0bad9a7085dcc6ba408566f72f39040535b3c2d917984" exitCode=0 Jan 27 17:30:02 crc kubenswrapper[4772]: I0127 17:30:02.970764 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" event={"ID":"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874","Type":"ContainerDied","Data":"2c0ec74fa28e0e4c9ec0bad9a7085dcc6ba408566f72f39040535b3c2d917984"} Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.334041 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.401316 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume\") pod \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.401684 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume\") pod \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\" (UID: \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.401945 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4v8d\" (UniqueName: \"kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d\") pod \"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\" (UID: 
\"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874\") " Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.402672 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume" (OuterVolumeSpecName: "config-volume") pod "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" (UID: "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.403539 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.409900 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" (UID: "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.412497 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d" (OuterVolumeSpecName: "kube-api-access-c4v8d") pod "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" (UID: "c02097b9-7eb9-4e2c-ad5e-5ec0983cb874"). InnerVolumeSpecName "kube-api-access-c4v8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.505606 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.505681 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4v8d\" (UniqueName: \"kubernetes.io/projected/c02097b9-7eb9-4e2c-ad5e-5ec0983cb874-kube-api-access-c4v8d\") on node \"crc\" DevicePath \"\"" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.687790 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l"] Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.694455 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492205-9hv9l"] Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.996745 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" event={"ID":"c02097b9-7eb9-4e2c-ad5e-5ec0983cb874","Type":"ContainerDied","Data":"e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e"} Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.997127 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0a9f131bb0864deac9499968e098890c0ec60bcb62ff0e22a1c29d01928a81e" Jan 27 17:30:04 crc kubenswrapper[4772]: I0127 17:30:04.996816 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492250-lgh54" Jan 27 17:30:06 crc kubenswrapper[4772]: I0127 17:30:06.678640 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb399a66-1690-4026-9b2e-9e399d3270d2" path="/var/lib/kubelet/pods/eb399a66-1690-4026-9b2e-9e399d3270d2/volumes" Jan 27 17:30:11 crc kubenswrapper[4772]: I0127 17:30:11.158891 4772 scope.go:117] "RemoveContainer" containerID="3b5c3dfd99ca4b5982c3131c3d5ce465e41cbe4ff9774e156e9f425715410ede" Jan 27 17:30:11 crc kubenswrapper[4772]: I0127 17:30:11.663555 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:30:11 crc kubenswrapper[4772]: E0127 17:30:11.663900 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:30:22 crc kubenswrapper[4772]: I0127 17:30:22.663300 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:30:22 crc kubenswrapper[4772]: E0127 17:30:22.664218 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:30:34 crc kubenswrapper[4772]: I0127 17:30:34.678710 4772 scope.go:117] "RemoveContainer" 
containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:30:34 crc kubenswrapper[4772]: E0127 17:30:34.680161 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:30:48 crc kubenswrapper[4772]: I0127 17:30:48.663730 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:30:48 crc kubenswrapper[4772]: E0127 17:30:48.665052 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:31:00 crc kubenswrapper[4772]: I0127 17:31:00.664669 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:31:00 crc kubenswrapper[4772]: E0127 17:31:00.666013 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:31:15 crc kubenswrapper[4772]: I0127 17:31:15.662626 4772 scope.go:117] 
"RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:31:15 crc kubenswrapper[4772]: E0127 17:31:15.663485 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:31:26 crc kubenswrapper[4772]: I0127 17:31:26.663682 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:31:26 crc kubenswrapper[4772]: E0127 17:31:26.665636 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:31:40 crc kubenswrapper[4772]: I0127 17:31:40.665440 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:31:40 crc kubenswrapper[4772]: E0127 17:31:40.668057 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:31:55 crc kubenswrapper[4772]: I0127 17:31:55.663292 
4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:31:55 crc kubenswrapper[4772]: E0127 17:31:55.663981 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:32:10 crc kubenswrapper[4772]: I0127 17:32:10.663845 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:32:10 crc kubenswrapper[4772]: E0127 17:32:10.664835 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:32:23 crc kubenswrapper[4772]: I0127 17:32:23.663654 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:32:23 crc kubenswrapper[4772]: E0127 17:32:23.665468 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:32:34 crc kubenswrapper[4772]: I0127 
17:32:34.684645 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284"
Jan 27 17:32:34 crc kubenswrapper[4772]: E0127 17:32:34.686202 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b"
Jan 27 17:32:47 crc kubenswrapper[4772]: I0127 17:32:47.678342 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284"
Jan 27 17:32:49 crc kubenswrapper[4772]: I0127 17:32:49.672459 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d"}
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.627047 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:13 crc kubenswrapper[4772]: E0127 17:33:13.628346 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" containerName="collect-profiles"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.628364 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" containerName="collect-profiles"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.628651 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c02097b9-7eb9-4e2c-ad5e-5ec0983cb874" containerName="collect-profiles"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.630993 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.637719 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.769828 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.769942 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55llv\" (UniqueName: \"kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.769994 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.871710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.872046 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55llv\" (UniqueName: \"kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.872194 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.872316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:13 crc kubenswrapper[4772]: I0127 17:33:13.872545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:14 crc kubenswrapper[4772]: I0127 17:33:14.286147 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55llv\" (UniqueName: \"kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv\") pod \"community-operators-x5jtw\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") " pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:14 crc kubenswrapper[4772]: I0127 17:33:14.565664 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:15 crc kubenswrapper[4772]: I0127 17:33:15.139888 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:15 crc kubenswrapper[4772]: I0127 17:33:15.971751 4772 generic.go:334] "Generic (PLEG): container finished" podID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerID="d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976" exitCode=0
Jan 27 17:33:15 crc kubenswrapper[4772]: I0127 17:33:15.971968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerDied","Data":"d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976"}
Jan 27 17:33:15 crc kubenswrapper[4772]: I0127 17:33:15.972457 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerStarted","Data":"d2736ac9699bef9db0c5e6e51ca8cd5f0baa3e5f558e7d1f0fcced8539a9c015"}
Jan 27 17:33:15 crc kubenswrapper[4772]: I0127 17:33:15.974895 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 17:33:18 crc kubenswrapper[4772]: I0127 17:33:18.005143 4772 generic.go:334] "Generic (PLEG): container finished" podID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerID="e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e" exitCode=0
Jan 27 17:33:18 crc kubenswrapper[4772]: I0127 17:33:18.005213 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerDied","Data":"e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e"}
Jan 27 17:33:19 crc kubenswrapper[4772]: I0127 17:33:19.022500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerStarted","Data":"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"}
Jan 27 17:33:19 crc kubenswrapper[4772]: I0127 17:33:19.055721 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x5jtw" podStartSLOduration=3.5990815830000003 podStartE2EDuration="6.055703774s" podCreationTimestamp="2026-01-27 17:33:13 +0000 UTC" firstStartedPulling="2026-01-27 17:33:15.974334131 +0000 UTC m=+8781.954943259" lastFinishedPulling="2026-01-27 17:33:18.430956332 +0000 UTC m=+8784.411565450" observedRunningTime="2026-01-27 17:33:19.053473392 +0000 UTC m=+8785.034082490" watchObservedRunningTime="2026-01-27 17:33:19.055703774 +0000 UTC m=+8785.036312872"
Jan 27 17:33:24 crc kubenswrapper[4772]: I0127 17:33:24.565858 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:24 crc kubenswrapper[4772]: I0127 17:33:24.566608 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:24 crc kubenswrapper[4772]: I0127 17:33:24.639837 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:25 crc kubenswrapper[4772]: I0127 17:33:25.167963 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:25 crc kubenswrapper[4772]: I0127 17:33:25.248049 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:27 crc kubenswrapper[4772]: I0127 17:33:27.114050 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x5jtw" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="registry-server" containerID="cri-o://bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc" gracePeriod=2
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.025738 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.123957 4772 generic.go:334] "Generic (PLEG): container finished" podID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerID="bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc" exitCode=0
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.123993 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerDied","Data":"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"}
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.124021 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x5jtw" event={"ID":"0effe0d1-557e-4294-b4c7-71060dec32e0","Type":"ContainerDied","Data":"d2736ac9699bef9db0c5e6e51ca8cd5f0baa3e5f558e7d1f0fcced8539a9c015"}
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.124036 4772 scope.go:117] "RemoveContainer" containerID="bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.124059 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x5jtw"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.144012 4772 scope.go:117] "RemoveContainer" containerID="e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.164217 4772 scope.go:117] "RemoveContainer" containerID="d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.193826 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities\") pod \"0effe0d1-557e-4294-b4c7-71060dec32e0\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") "
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.193891 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55llv\" (UniqueName: \"kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv\") pod \"0effe0d1-557e-4294-b4c7-71060dec32e0\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") "
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.194064 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content\") pod \"0effe0d1-557e-4294-b4c7-71060dec32e0\" (UID: \"0effe0d1-557e-4294-b4c7-71060dec32e0\") "
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.194975 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities" (OuterVolumeSpecName: "utilities") pod "0effe0d1-557e-4294-b4c7-71060dec32e0" (UID: "0effe0d1-557e-4294-b4c7-71060dec32e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.200241 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv" (OuterVolumeSpecName: "kube-api-access-55llv") pod "0effe0d1-557e-4294-b4c7-71060dec32e0" (UID: "0effe0d1-557e-4294-b4c7-71060dec32e0"). InnerVolumeSpecName "kube-api-access-55llv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.268382 4772 scope.go:117] "RemoveContainer" containerID="bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"
Jan 27 17:33:28 crc kubenswrapper[4772]: E0127 17:33:28.269211 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc\": container with ID starting with bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc not found: ID does not exist" containerID="bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.269281 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc"} err="failed to get container status \"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc\": rpc error: code = NotFound desc = could not find container \"bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc\": container with ID starting with bf765e64a424bbf22c0d9aa03a33fb814fb2c400906dcca8835df8f59a67eddc not found: ID does not exist"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.269364 4772 scope.go:117] "RemoveContainer" containerID="e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e"
Jan 27 17:33:28 crc kubenswrapper[4772]: E0127 17:33:28.270323 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e\": container with ID starting with e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e not found: ID does not exist" containerID="e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.270385 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e"} err="failed to get container status \"e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e\": rpc error: code = NotFound desc = could not find container \"e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e\": container with ID starting with e03d3a2bd2216c5cd5a13cc3d14af89ab95fccb13a8daba4c24389313380f97e not found: ID does not exist"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.270416 4772 scope.go:117] "RemoveContainer" containerID="d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976"
Jan 27 17:33:28 crc kubenswrapper[4772]: E0127 17:33:28.271144 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976\": container with ID starting with d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976 not found: ID does not exist" containerID="d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.271219 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976"} err="failed to get container status \"d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976\": rpc error: code = NotFound desc = could not find container \"d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976\": container with ID starting with d553db1d54a8e5c7fa11aff0982e1c195dfbd54e62c92c4bd24f05551d70a976 not found: ID does not exist"
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.296469 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.296506 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55llv\" (UniqueName: \"kubernetes.io/projected/0effe0d1-557e-4294-b4c7-71060dec32e0-kube-api-access-55llv\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.521004 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0effe0d1-557e-4294-b4c7-71060dec32e0" (UID: "0effe0d1-557e-4294-b4c7-71060dec32e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.601439 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0effe0d1-557e-4294-b4c7-71060dec32e0-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.757701 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:28 crc kubenswrapper[4772]: I0127 17:33:28.774372 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x5jtw"]
Jan 27 17:33:30 crc kubenswrapper[4772]: I0127 17:33:30.675219 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" path="/var/lib/kubelet/pods/0effe0d1-557e-4294-b4c7-71060dec32e0/volumes"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.843387 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:35 crc kubenswrapper[4772]: E0127 17:33:35.846706 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="registry-server"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.846781 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="registry-server"
Jan 27 17:33:35 crc kubenswrapper[4772]: E0127 17:33:35.846803 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="extract-utilities"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.846816 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="extract-utilities"
Jan 27 17:33:35 crc kubenswrapper[4772]: E0127 17:33:35.847017 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="extract-content"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.847035 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="extract-content"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.847485 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0effe0d1-557e-4294-b4c7-71060dec32e0" containerName="registry-server"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.850776 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:35 crc kubenswrapper[4772]: I0127 17:33:35.874734 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.003805 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zpgm\" (UniqueName: \"kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.003869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.004062 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.105590 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zpgm\" (UniqueName: \"kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.105690 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.105759 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.106287 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.106557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.123850 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zpgm\" (UniqueName: \"kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm\") pod \"redhat-marketplace-mnwvb\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") " pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.178001 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:36 crc kubenswrapper[4772]: I0127 17:33:36.716893 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:37 crc kubenswrapper[4772]: I0127 17:33:37.216609 4772 generic.go:334] "Generic (PLEG): container finished" podID="480c784a-4028-480a-9400-d0c5a26072aa" containerID="49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6" exitCode=0
Jan 27 17:33:37 crc kubenswrapper[4772]: I0127 17:33:37.216651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerDied","Data":"49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6"}
Jan 27 17:33:37 crc kubenswrapper[4772]: I0127 17:33:37.216677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerStarted","Data":"64fe0e4178b2f7c828957a4dc7ec17e010d253740ac714a22f2b3416b207e566"}
Jan 27 17:33:39 crc kubenswrapper[4772]: I0127 17:33:39.242262 4772 generic.go:334] "Generic (PLEG): container finished" podID="480c784a-4028-480a-9400-d0c5a26072aa" containerID="c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d" exitCode=0
Jan 27 17:33:39 crc kubenswrapper[4772]: I0127 17:33:39.242350 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerDied","Data":"c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d"}
Jan 27 17:33:40 crc kubenswrapper[4772]: I0127 17:33:40.254867 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerStarted","Data":"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"}
Jan 27 17:33:40 crc kubenswrapper[4772]: I0127 17:33:40.284002 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnwvb" podStartSLOduration=2.833658369 podStartE2EDuration="5.283983716s" podCreationTimestamp="2026-01-27 17:33:35 +0000 UTC" firstStartedPulling="2026-01-27 17:33:37.219674427 +0000 UTC m=+8803.200283565" lastFinishedPulling="2026-01-27 17:33:39.669999794 +0000 UTC m=+8805.650608912" observedRunningTime="2026-01-27 17:33:40.278338029 +0000 UTC m=+8806.258947127" watchObservedRunningTime="2026-01-27 17:33:40.283983716 +0000 UTC m=+8806.264592814"
Jan 27 17:33:46 crc kubenswrapper[4772]: I0127 17:33:46.178829 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:46 crc kubenswrapper[4772]: I0127 17:33:46.179671 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:46 crc kubenswrapper[4772]: I0127 17:33:46.253951 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:46 crc kubenswrapper[4772]: I0127 17:33:46.370924 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:46 crc kubenswrapper[4772]: I0127 17:33:46.505190 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.332164 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mnwvb" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="registry-server" containerID="cri-o://39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72" gracePeriod=2
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.841828 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.927266 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities\") pod \"480c784a-4028-480a-9400-d0c5a26072aa\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") "
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.927334 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content\") pod \"480c784a-4028-480a-9400-d0c5a26072aa\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") "
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.927536 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zpgm\" (UniqueName: \"kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm\") pod \"480c784a-4028-480a-9400-d0c5a26072aa\" (UID: \"480c784a-4028-480a-9400-d0c5a26072aa\") "
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.928995 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities" (OuterVolumeSpecName: "utilities") pod "480c784a-4028-480a-9400-d0c5a26072aa" (UID: "480c784a-4028-480a-9400-d0c5a26072aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.933646 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm" (OuterVolumeSpecName: "kube-api-access-4zpgm") pod "480c784a-4028-480a-9400-d0c5a26072aa" (UID: "480c784a-4028-480a-9400-d0c5a26072aa"). InnerVolumeSpecName "kube-api-access-4zpgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 17:33:48 crc kubenswrapper[4772]: I0127 17:33:48.951234 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "480c784a-4028-480a-9400-d0c5a26072aa" (UID: "480c784a-4028-480a-9400-d0c5a26072aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.029588 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.029630 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480c784a-4028-480a-9400-d0c5a26072aa-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.029646 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zpgm\" (UniqueName: \"kubernetes.io/projected/480c784a-4028-480a-9400-d0c5a26072aa-kube-api-access-4zpgm\") on node \"crc\" DevicePath \"\""
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.347242 4772 generic.go:334] "Generic (PLEG): container finished" podID="480c784a-4028-480a-9400-d0c5a26072aa" containerID="39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72" exitCode=0
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.347396 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnwvb"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.348495 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerDied","Data":"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"}
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.348624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnwvb" event={"ID":"480c784a-4028-480a-9400-d0c5a26072aa","Type":"ContainerDied","Data":"64fe0e4178b2f7c828957a4dc7ec17e010d253740ac714a22f2b3416b207e566"}
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.348720 4772 scope.go:117] "RemoveContainer" containerID="39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.389030 4772 scope.go:117] "RemoveContainer" containerID="c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.406049 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.429927 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnwvb"]
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.434837 4772 scope.go:117] "RemoveContainer" containerID="49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.497135 4772 scope.go:117] "RemoveContainer" containerID="39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"
Jan 27 17:33:49 crc kubenswrapper[4772]: E0127 17:33:49.497677 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72\": container with ID starting with 39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72 not found: ID does not exist" containerID="39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.497718 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72"} err="failed to get container status \"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72\": rpc error: code = NotFound desc = could not find container \"39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72\": container with ID starting with 39dbf45dc1eeb8a5861376f800ef838077a260ad19d02ba9cc20d0c42f53df72 not found: ID does not exist"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.497744 4772 scope.go:117] "RemoveContainer" containerID="c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d"
Jan 27 17:33:49 crc kubenswrapper[4772]: E0127 17:33:49.498117 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d\": container with ID starting with c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d not found: ID does not exist" containerID="c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.498153 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d"} err="failed to get container status \"c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d\": rpc error: code = NotFound desc = could not find container \"c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d\": container with ID starting with c0832a14e25fd5e95433f847268a64854b319bd9eea88f84e16396ac7edd5e8d not found: ID does not exist"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.498207 4772 scope.go:117] "RemoveContainer" containerID="49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6"
Jan 27 17:33:49 crc kubenswrapper[4772]: E0127 17:33:49.498680 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6\": container with ID starting with 49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6 not found: ID does not exist" containerID="49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6"
Jan 27 17:33:49 crc kubenswrapper[4772]: I0127 17:33:49.498707 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6"} err="failed to get container status \"49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6\": rpc error: code = NotFound desc = could not find container \"49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6\": container with ID starting with 49101020f0d6ef70ba5d6fc50a8fcd66dcfad489ffd620338fb72a9aa2b0bab6 not found: ID does not exist"
Jan 27 17:33:50 crc kubenswrapper[4772]: I0127 17:33:50.672822 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480c784a-4028-480a-9400-d0c5a26072aa" path="/var/lib/kubelet/pods/480c784a-4028-480a-9400-d0c5a26072aa/volumes"
Jan 27 17:35:12 crc kubenswrapper[4772]: I0127 17:35:12.058437 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 17:35:12 crc kubenswrapper[4772]: I0127
17:35:12.059694 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:35:42 crc kubenswrapper[4772]: I0127 17:35:42.059005 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:35:42 crc kubenswrapper[4772]: I0127 17:35:42.059657 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.630735 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:35:51 crc kubenswrapper[4772]: E0127 17:35:51.634006 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="extract-content" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.634148 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="extract-content" Jan 27 17:35:51 crc kubenswrapper[4772]: E0127 17:35:51.634282 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="extract-utilities" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.634378 4772 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="extract-utilities" Jan 27 17:35:51 crc kubenswrapper[4772]: E0127 17:35:51.634528 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="registry-server" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.634612 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="registry-server" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.635125 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="480c784a-4028-480a-9400-d0c5a26072aa" containerName="registry-server" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.637515 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.658542 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.740321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.740387 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmtv8\" (UniqueName: \"kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.741405 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.843016 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.843189 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.843216 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmtv8\" (UniqueName: \"kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.843582 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.844088 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.869120 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmtv8\" (UniqueName: \"kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8\") pod \"certified-operators-nz2jj\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:51 crc kubenswrapper[4772]: I0127 17:35:51.999452 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:35:52 crc kubenswrapper[4772]: I0127 17:35:52.508840 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:35:52 crc kubenswrapper[4772]: I0127 17:35:52.560537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerStarted","Data":"b49ab6bff4c5c8f14d6eb5920a5654de880c1aee811c325479fc5b2f59a6b28c"} Jan 27 17:35:53 crc kubenswrapper[4772]: I0127 17:35:53.576819 4772 generic.go:334] "Generic (PLEG): container finished" podID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerID="446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7" exitCode=0 Jan 27 17:35:53 crc kubenswrapper[4772]: I0127 17:35:53.576924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerDied","Data":"446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7"} Jan 27 17:35:54 crc kubenswrapper[4772]: I0127 
17:35:54.599708 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerStarted","Data":"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d"} Jan 27 17:35:55 crc kubenswrapper[4772]: I0127 17:35:55.613070 4772 generic.go:334] "Generic (PLEG): container finished" podID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerID="96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d" exitCode=0 Jan 27 17:35:55 crc kubenswrapper[4772]: I0127 17:35:55.613110 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerDied","Data":"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d"} Jan 27 17:35:56 crc kubenswrapper[4772]: I0127 17:35:56.623797 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerStarted","Data":"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd"} Jan 27 17:35:56 crc kubenswrapper[4772]: I0127 17:35:56.652979 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nz2jj" podStartSLOduration=3.221901419 podStartE2EDuration="5.65295613s" podCreationTimestamp="2026-01-27 17:35:51 +0000 UTC" firstStartedPulling="2026-01-27 17:35:53.585809672 +0000 UTC m=+8939.566418770" lastFinishedPulling="2026-01-27 17:35:56.016864363 +0000 UTC m=+8941.997473481" observedRunningTime="2026-01-27 17:35:56.645337818 +0000 UTC m=+8942.625946946" watchObservedRunningTime="2026-01-27 17:35:56.65295613 +0000 UTC m=+8942.633565268" Jan 27 17:36:02 crc kubenswrapper[4772]: I0127 17:36:01.999633 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:02 crc kubenswrapper[4772]: I0127 17:36:02.001801 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:02 crc kubenswrapper[4772]: I0127 17:36:02.087301 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:02 crc kubenswrapper[4772]: I0127 17:36:02.762034 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:02 crc kubenswrapper[4772]: I0127 17:36:02.821351 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:36:04 crc kubenswrapper[4772]: I0127 17:36:04.721944 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nz2jj" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="registry-server" containerID="cri-o://1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd" gracePeriod=2 Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.195218 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.225638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content\") pod \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.225725 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities\") pod \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.225876 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmtv8\" (UniqueName: \"kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8\") pod \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\" (UID: \"3da4f7ee-931f-4606-b652-ca6bb3b36bcc\") " Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.228382 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities" (OuterVolumeSpecName: "utilities") pod "3da4f7ee-931f-4606-b652-ca6bb3b36bcc" (UID: "3da4f7ee-931f-4606-b652-ca6bb3b36bcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.243088 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8" (OuterVolumeSpecName: "kube-api-access-wmtv8") pod "3da4f7ee-931f-4606-b652-ca6bb3b36bcc" (UID: "3da4f7ee-931f-4606-b652-ca6bb3b36bcc"). InnerVolumeSpecName "kube-api-access-wmtv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.332751 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.332804 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmtv8\" (UniqueName: \"kubernetes.io/projected/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-kube-api-access-wmtv8\") on node \"crc\" DevicePath \"\"" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.374419 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3da4f7ee-931f-4606-b652-ca6bb3b36bcc" (UID: "3da4f7ee-931f-4606-b652-ca6bb3b36bcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.435102 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3da4f7ee-931f-4606-b652-ca6bb3b36bcc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.734897 4772 generic.go:334] "Generic (PLEG): container finished" podID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerID="1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd" exitCode=0 Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.734967 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nz2jj" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.735001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerDied","Data":"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd"} Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.735374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nz2jj" event={"ID":"3da4f7ee-931f-4606-b652-ca6bb3b36bcc","Type":"ContainerDied","Data":"b49ab6bff4c5c8f14d6eb5920a5654de880c1aee811c325479fc5b2f59a6b28c"} Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.735429 4772 scope.go:117] "RemoveContainer" containerID="1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.767767 4772 scope.go:117] "RemoveContainer" containerID="96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.807446 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.815359 4772 scope.go:117] "RemoveContainer" containerID="446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.824157 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nz2jj"] Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.868491 4772 scope.go:117] "RemoveContainer" containerID="1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd" Jan 27 17:36:05 crc kubenswrapper[4772]: E0127 17:36:05.869125 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd\": container with ID starting with 1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd not found: ID does not exist" containerID="1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.869268 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd"} err="failed to get container status \"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd\": rpc error: code = NotFound desc = could not find container \"1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd\": container with ID starting with 1b92529d945a8ea40be9f2fcb29ba3d51fc168d67d9414057934202d95b840dd not found: ID does not exist" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.869314 4772 scope.go:117] "RemoveContainer" containerID="96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d" Jan 27 17:36:05 crc kubenswrapper[4772]: E0127 17:36:05.869726 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d\": container with ID starting with 96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d not found: ID does not exist" containerID="96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.869771 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d"} err="failed to get container status \"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d\": rpc error: code = NotFound desc = could not find container \"96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d\": container with ID 
starting with 96788f8c0400100015b3d653c4c77506acec40f39e8f9a95a39c10d64447317d not found: ID does not exist" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.869801 4772 scope.go:117] "RemoveContainer" containerID="446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7" Jan 27 17:36:05 crc kubenswrapper[4772]: E0127 17:36:05.870513 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7\": container with ID starting with 446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7 not found: ID does not exist" containerID="446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7" Jan 27 17:36:05 crc kubenswrapper[4772]: I0127 17:36:05.871197 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7"} err="failed to get container status \"446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7\": rpc error: code = NotFound desc = could not find container \"446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7\": container with ID starting with 446754c22019eb637f680b802348c7e55f93e4b8311ed203bf99915655bc42a7 not found: ID does not exist" Jan 27 17:36:06 crc kubenswrapper[4772]: I0127 17:36:06.685356 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" path="/var/lib/kubelet/pods/3da4f7ee-931f-4606-b652-ca6bb3b36bcc/volumes" Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.058092 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 
17:36:12.058992 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.059077 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.060407 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.060524 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d" gracePeriod=600 Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.834714 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d" exitCode=0 Jan 27 17:36:12 crc kubenswrapper[4772]: I0127 17:36:12.834943 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d"} Jan 27 17:36:12 crc 
kubenswrapper[4772]: I0127 17:36:12.835273 4772 scope.go:117] "RemoveContainer" containerID="af2493a7ca3ac75fd4192b599bed251fe8ce4cca24c4715f3493ee05b00e8284" Jan 27 17:36:13 crc kubenswrapper[4772]: I0127 17:36:13.847909 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d"} Jan 27 17:38:42 crc kubenswrapper[4772]: I0127 17:38:42.058056 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:38:42 crc kubenswrapper[4772]: I0127 17:38:42.058578 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:39:12 crc kubenswrapper[4772]: I0127 17:39:12.058693 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:39:12 crc kubenswrapper[4772]: I0127 17:39:12.059564 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 
17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.058639 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.059332 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.059392 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.060443 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.060572 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" gracePeriod=600 Jan 27 17:39:42 crc kubenswrapper[4772]: E0127 17:39:42.188744 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.355943 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" exitCode=0 Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.356044 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d"} Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.356307 4772 scope.go:117] "RemoveContainer" containerID="7028bdbe5a39e55f1ad82b72de13a00af4f685c12d26eb6560d24b432826149d" Jan 27 17:39:42 crc kubenswrapper[4772]: I0127 17:39:42.357220 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:39:42 crc kubenswrapper[4772]: E0127 17:39:42.357714 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:39:53 crc kubenswrapper[4772]: I0127 17:39:53.663742 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:39:53 crc kubenswrapper[4772]: E0127 17:39:53.664816 4772 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:40:04 crc kubenswrapper[4772]: I0127 17:40:04.676956 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:40:04 crc kubenswrapper[4772]: E0127 17:40:04.678277 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.697641 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:16 crc kubenswrapper[4772]: E0127 17:40:16.698948 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="extract-utilities" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.698974 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="extract-utilities" Jan 27 17:40:16 crc kubenswrapper[4772]: E0127 17:40:16.698991 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="extract-content" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.699002 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" 
containerName="extract-content" Jan 27 17:40:16 crc kubenswrapper[4772]: E0127 17:40:16.699016 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="registry-server" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.699026 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="registry-server" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.699351 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da4f7ee-931f-4606-b652-ca6bb3b36bcc" containerName="registry-server" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.701334 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.710385 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.790676 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fjnv\" (UniqueName: \"kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.790772 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.791476 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.893898 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.894211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fjnv\" (UniqueName: \"kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.894393 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.894703 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.894909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:16 crc kubenswrapper[4772]: I0127 17:40:16.991064 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fjnv\" (UniqueName: \"kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv\") pod \"redhat-operators-6f7k5\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.044895 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.500637 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.741242 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerID="b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419" exitCode=0 Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.741334 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerDied","Data":"b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419"} Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.741579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerStarted","Data":"ba22c7075b7ea96ca1087e7fff4b1cec7517736240e86b21b900c9d7c35d0e36"} Jan 27 17:40:17 crc kubenswrapper[4772]: I0127 17:40:17.742934 4772 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 27 17:40:18 crc kubenswrapper[4772]: I0127 17:40:18.665609 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:40:18 crc kubenswrapper[4772]: E0127 17:40:18.666753 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:40:18 crc kubenswrapper[4772]: I0127 17:40:18.756903 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerStarted","Data":"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb"} Jan 27 17:40:21 crc kubenswrapper[4772]: I0127 17:40:21.792555 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerID="5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb" exitCode=0 Jan 27 17:40:21 crc kubenswrapper[4772]: I0127 17:40:21.792651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerDied","Data":"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb"} Jan 27 17:40:22 crc kubenswrapper[4772]: I0127 17:40:22.803295 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerStarted","Data":"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38"} Jan 27 17:40:22 crc kubenswrapper[4772]: I0127 17:40:22.829099 
4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6f7k5" podStartSLOduration=2.30730743 podStartE2EDuration="6.829077118s" podCreationTimestamp="2026-01-27 17:40:16 +0000 UTC" firstStartedPulling="2026-01-27 17:40:17.742727833 +0000 UTC m=+9203.723336931" lastFinishedPulling="2026-01-27 17:40:22.264497521 +0000 UTC m=+9208.245106619" observedRunningTime="2026-01-27 17:40:22.825823005 +0000 UTC m=+9208.806432113" watchObservedRunningTime="2026-01-27 17:40:22.829077118 +0000 UTC m=+9208.809686236" Jan 27 17:40:27 crc kubenswrapper[4772]: I0127 17:40:27.046144 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:27 crc kubenswrapper[4772]: I0127 17:40:27.047094 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:28 crc kubenswrapper[4772]: I0127 17:40:28.123487 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6f7k5" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="registry-server" probeResult="failure" output=< Jan 27 17:40:28 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:40:28 crc kubenswrapper[4772]: > Jan 27 17:40:30 crc kubenswrapper[4772]: I0127 17:40:30.664005 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:40:30 crc kubenswrapper[4772]: E0127 17:40:30.665036 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:40:37 crc kubenswrapper[4772]: I0127 17:40:37.122129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:37 crc kubenswrapper[4772]: I0127 17:40:37.208639 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:37 crc kubenswrapper[4772]: I0127 17:40:37.380069 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:38 crc kubenswrapper[4772]: I0127 17:40:38.979950 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6f7k5" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="registry-server" containerID="cri-o://593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38" gracePeriod=2 Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.443959 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.473606 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities\") pod \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.473815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content\") pod \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.474011 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fjnv\" (UniqueName: \"kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv\") pod \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\" (UID: \"a3b36acb-1d5b-4384-9090-ce95e3d89a21\") " Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.474557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities" (OuterVolumeSpecName: "utilities") pod "a3b36acb-1d5b-4384-9090-ce95e3d89a21" (UID: "a3b36acb-1d5b-4384-9090-ce95e3d89a21"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.475626 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.482084 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv" (OuterVolumeSpecName: "kube-api-access-7fjnv") pod "a3b36acb-1d5b-4384-9090-ce95e3d89a21" (UID: "a3b36acb-1d5b-4384-9090-ce95e3d89a21"). InnerVolumeSpecName "kube-api-access-7fjnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.578321 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fjnv\" (UniqueName: \"kubernetes.io/projected/a3b36acb-1d5b-4384-9090-ce95e3d89a21-kube-api-access-7fjnv\") on node \"crc\" DevicePath \"\"" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.618042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3b36acb-1d5b-4384-9090-ce95e3d89a21" (UID: "a3b36acb-1d5b-4384-9090-ce95e3d89a21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.680110 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b36acb-1d5b-4384-9090-ce95e3d89a21-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.991334 4772 generic.go:334] "Generic (PLEG): container finished" podID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerID="593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38" exitCode=0 Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.991388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerDied","Data":"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38"} Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.991444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6f7k5" event={"ID":"a3b36acb-1d5b-4384-9090-ce95e3d89a21","Type":"ContainerDied","Data":"ba22c7075b7ea96ca1087e7fff4b1cec7517736240e86b21b900c9d7c35d0e36"} Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.991449 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6f7k5" Jan 27 17:40:39 crc kubenswrapper[4772]: I0127 17:40:39.991467 4772 scope.go:117] "RemoveContainer" containerID="593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.013314 4772 scope.go:117] "RemoveContainer" containerID="5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.032349 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.042130 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6f7k5"] Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.051867 4772 scope.go:117] "RemoveContainer" containerID="b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.078152 4772 scope.go:117] "RemoveContainer" containerID="593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38" Jan 27 17:40:40 crc kubenswrapper[4772]: E0127 17:40:40.078884 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38\": container with ID starting with 593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38 not found: ID does not exist" containerID="593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.078950 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38"} err="failed to get container status \"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38\": rpc error: code = NotFound desc = could not find container 
\"593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38\": container with ID starting with 593d19ce0cdbbf6fdbd0b96cef5cb458224a327825a61f25f1b4182c27097b38 not found: ID does not exist" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.078994 4772 scope.go:117] "RemoveContainer" containerID="5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb" Jan 27 17:40:40 crc kubenswrapper[4772]: E0127 17:40:40.079446 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb\": container with ID starting with 5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb not found: ID does not exist" containerID="5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.079509 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb"} err="failed to get container status \"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb\": rpc error: code = NotFound desc = could not find container \"5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb\": container with ID starting with 5aea5fa5756b456fa46058d90e0286833506c245fcf9cb1a7c973fe4db57abeb not found: ID does not exist" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.079549 4772 scope.go:117] "RemoveContainer" containerID="b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419" Jan 27 17:40:40 crc kubenswrapper[4772]: E0127 17:40:40.079895 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419\": container with ID starting with b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419 not found: ID does not exist" 
containerID="b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.079944 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419"} err="failed to get container status \"b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419\": rpc error: code = NotFound desc = could not find container \"b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419\": container with ID starting with b09d5a873a7a863058f1edfb234292bbee9181b432c9d473d092af477ae78419 not found: ID does not exist" Jan 27 17:40:40 crc kubenswrapper[4772]: I0127 17:40:40.678012 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" path="/var/lib/kubelet/pods/a3b36acb-1d5b-4384-9090-ce95e3d89a21/volumes" Jan 27 17:40:41 crc kubenswrapper[4772]: I0127 17:40:41.663215 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:40:41 crc kubenswrapper[4772]: E0127 17:40:41.663591 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:40:56 crc kubenswrapper[4772]: I0127 17:40:56.663301 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:40:56 crc kubenswrapper[4772]: E0127 17:40:56.664722 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:41:11 crc kubenswrapper[4772]: I0127 17:41:11.664146 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:41:11 crc kubenswrapper[4772]: E0127 17:41:11.665273 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:41:23 crc kubenswrapper[4772]: I0127 17:41:23.664389 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:41:23 crc kubenswrapper[4772]: E0127 17:41:23.665290 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:41:37 crc kubenswrapper[4772]: I0127 17:41:37.668790 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:41:37 crc kubenswrapper[4772]: E0127 17:41:37.670323 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:41:50 crc kubenswrapper[4772]: I0127 17:41:50.662881 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:41:50 crc kubenswrapper[4772]: E0127 17:41:50.663790 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:42:04 crc kubenswrapper[4772]: I0127 17:42:04.671320 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:42:04 crc kubenswrapper[4772]: E0127 17:42:04.672107 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:42:18 crc kubenswrapper[4772]: I0127 17:42:18.663489 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:42:18 crc kubenswrapper[4772]: E0127 17:42:18.664607 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:42:29 crc kubenswrapper[4772]: I0127 17:42:29.663530 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:42:29 crc kubenswrapper[4772]: E0127 17:42:29.664287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:42:41 crc kubenswrapper[4772]: I0127 17:42:41.662865 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:42:41 crc kubenswrapper[4772]: E0127 17:42:41.663706 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:42:52 crc kubenswrapper[4772]: I0127 17:42:52.663919 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:42:52 crc kubenswrapper[4772]: E0127 17:42:52.664813 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:43:04 crc kubenswrapper[4772]: I0127 17:43:04.676030 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:43:04 crc kubenswrapper[4772]: E0127 17:43:04.677673 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:43:16 crc kubenswrapper[4772]: I0127 17:43:16.664549 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:43:16 crc kubenswrapper[4772]: E0127 17:43:16.665814 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.488796 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:17 crc kubenswrapper[4772]: E0127 17:43:17.489608 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="extract-content" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.489636 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="extract-content" Jan 27 17:43:17 crc kubenswrapper[4772]: E0127 17:43:17.489656 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="registry-server" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.489664 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="registry-server" Jan 27 17:43:17 crc kubenswrapper[4772]: E0127 17:43:17.489701 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="extract-utilities" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.489710 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="extract-utilities" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.489947 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b36acb-1d5b-4384-9090-ce95e3d89a21" containerName="registry-server" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.491532 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.510540 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.647343 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.647487 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqn9m\" (UniqueName: \"kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.649274 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.751455 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.751551 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.751579 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqn9m\" (UniqueName: \"kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.752146 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:17 crc kubenswrapper[4772]: I0127 17:43:17.752220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:18 crc kubenswrapper[4772]: I0127 17:43:18.191877 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqn9m\" (UniqueName: \"kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m\") pod \"community-operators-5kdsn\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:18 crc kubenswrapper[4772]: I0127 17:43:18.416056 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:18 crc kubenswrapper[4772]: I0127 17:43:18.862457 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:19 crc kubenswrapper[4772]: I0127 17:43:19.629057 4772 generic.go:334] "Generic (PLEG): container finished" podID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerID="1de1923bc262c5aaefa8cf7b56877449ea107eda643f14cf3ab05e1b3236c906" exitCode=0 Jan 27 17:43:19 crc kubenswrapper[4772]: I0127 17:43:19.629357 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerDied","Data":"1de1923bc262c5aaefa8cf7b56877449ea107eda643f14cf3ab05e1b3236c906"} Jan 27 17:43:19 crc kubenswrapper[4772]: I0127 17:43:19.629387 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerStarted","Data":"bd47383e58f847ae39fb7fb3c9b02b7da360d7ebf0041c54c2a55ffc5d783006"} Jan 27 17:43:21 crc kubenswrapper[4772]: I0127 17:43:21.665417 4772 generic.go:334] "Generic (PLEG): container finished" podID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerID="d61e1ffdf7b107647ae33c5ea2c7619bc98bdd510e69bf094fc6877e8690c610" exitCode=0 Jan 27 17:43:21 crc kubenswrapper[4772]: I0127 17:43:21.665506 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerDied","Data":"d61e1ffdf7b107647ae33c5ea2c7619bc98bdd510e69bf094fc6877e8690c610"} Jan 27 17:43:22 crc kubenswrapper[4772]: I0127 17:43:22.696870 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" 
event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerStarted","Data":"44933a5bde23ef59092f591e6e7a85ecc728ca650a6bece714d8a8b7d09229e6"} Jan 27 17:43:22 crc kubenswrapper[4772]: I0127 17:43:22.717435 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5kdsn" podStartSLOduration=3.276583875 podStartE2EDuration="5.717418478s" podCreationTimestamp="2026-01-27 17:43:17 +0000 UTC" firstStartedPulling="2026-01-27 17:43:19.630918082 +0000 UTC m=+9385.611527190" lastFinishedPulling="2026-01-27 17:43:22.071752675 +0000 UTC m=+9388.052361793" observedRunningTime="2026-01-27 17:43:22.717034197 +0000 UTC m=+9388.697643315" watchObservedRunningTime="2026-01-27 17:43:22.717418478 +0000 UTC m=+9388.698027576" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.416391 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.416969 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.471118 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.663607 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:43:28 crc kubenswrapper[4772]: E0127 17:43:28.664217 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" 
podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.880796 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:28 crc kubenswrapper[4772]: I0127 17:43:28.948536 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:30 crc kubenswrapper[4772]: I0127 17:43:30.816641 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5kdsn" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="registry-server" containerID="cri-o://44933a5bde23ef59092f591e6e7a85ecc728ca650a6bece714d8a8b7d09229e6" gracePeriod=2 Jan 27 17:43:31 crc kubenswrapper[4772]: I0127 17:43:31.829138 4772 generic.go:334] "Generic (PLEG): container finished" podID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerID="44933a5bde23ef59092f591e6e7a85ecc728ca650a6bece714d8a8b7d09229e6" exitCode=0 Jan 27 17:43:31 crc kubenswrapper[4772]: I0127 17:43:31.829215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerDied","Data":"44933a5bde23ef59092f591e6e7a85ecc728ca650a6bece714d8a8b7d09229e6"} Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.202804 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.354108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content\") pod \"0c6ade8d-638e-415e-8363-af0aa99994b2\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.354212 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqn9m\" (UniqueName: \"kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m\") pod \"0c6ade8d-638e-415e-8363-af0aa99994b2\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.354597 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities\") pod \"0c6ade8d-638e-415e-8363-af0aa99994b2\" (UID: \"0c6ade8d-638e-415e-8363-af0aa99994b2\") " Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.356544 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities" (OuterVolumeSpecName: "utilities") pod "0c6ade8d-638e-415e-8363-af0aa99994b2" (UID: "0c6ade8d-638e-415e-8363-af0aa99994b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.361850 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m" (OuterVolumeSpecName: "kube-api-access-kqn9m") pod "0c6ade8d-638e-415e-8363-af0aa99994b2" (UID: "0c6ade8d-638e-415e-8363-af0aa99994b2"). InnerVolumeSpecName "kube-api-access-kqn9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.402031 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c6ade8d-638e-415e-8363-af0aa99994b2" (UID: "0c6ade8d-638e-415e-8363-af0aa99994b2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.456750 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.456797 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqn9m\" (UniqueName: \"kubernetes.io/projected/0c6ade8d-638e-415e-8363-af0aa99994b2-kube-api-access-kqn9m\") on node \"crc\" DevicePath \"\"" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.456812 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6ade8d-638e-415e-8363-af0aa99994b2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.844757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kdsn" event={"ID":"0c6ade8d-638e-415e-8363-af0aa99994b2","Type":"ContainerDied","Data":"bd47383e58f847ae39fb7fb3c9b02b7da360d7ebf0041c54c2a55ffc5d783006"} Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.844905 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5kdsn" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.846251 4772 scope.go:117] "RemoveContainer" containerID="44933a5bde23ef59092f591e6e7a85ecc728ca650a6bece714d8a8b7d09229e6" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.885818 4772 scope.go:117] "RemoveContainer" containerID="d61e1ffdf7b107647ae33c5ea2c7619bc98bdd510e69bf094fc6877e8690c610" Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.901576 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.915579 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5kdsn"] Jan 27 17:43:32 crc kubenswrapper[4772]: I0127 17:43:32.928593 4772 scope.go:117] "RemoveContainer" containerID="1de1923bc262c5aaefa8cf7b56877449ea107eda643f14cf3ab05e1b3236c906" Jan 27 17:43:34 crc kubenswrapper[4772]: I0127 17:43:34.682685 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" path="/var/lib/kubelet/pods/0c6ade8d-638e-415e-8363-af0aa99994b2/volumes" Jan 27 17:43:41 crc kubenswrapper[4772]: I0127 17:43:41.663261 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:43:41 crc kubenswrapper[4772]: E0127 17:43:41.664522 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:43:52 crc kubenswrapper[4772]: I0127 17:43:52.664204 4772 scope.go:117] "RemoveContainer" 
containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:43:52 crc kubenswrapper[4772]: E0127 17:43:52.665009 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:44:04 crc kubenswrapper[4772]: I0127 17:44:04.674700 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:44:04 crc kubenswrapper[4772]: E0127 17:44:04.675972 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.485025 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:16 crc kubenswrapper[4772]: E0127 17:44:16.486118 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="extract-utilities" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.486134 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="extract-utilities" Jan 27 17:44:16 crc kubenswrapper[4772]: E0127 17:44:16.486150 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" 
containerName="extract-content" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.486158 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="extract-content" Jan 27 17:44:16 crc kubenswrapper[4772]: E0127 17:44:16.486193 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="registry-server" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.486202 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="registry-server" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.486422 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6ade8d-638e-415e-8363-af0aa99994b2" containerName="registry-server" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.488074 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.505967 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.610180 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx5j7\" (UniqueName: \"kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.610451 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " 
pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.610673 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.712093 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx5j7\" (UniqueName: \"kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.712143 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.712259 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.712844 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " 
pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.712860 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.735125 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx5j7\" (UniqueName: \"kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7\") pod \"redhat-marketplace-ztd7g\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:16 crc kubenswrapper[4772]: I0127 17:44:16.877942 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:17 crc kubenswrapper[4772]: I0127 17:44:17.355122 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:17 crc kubenswrapper[4772]: I0127 17:44:17.663060 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:44:17 crc kubenswrapper[4772]: E0127 17:44:17.663408 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:44:18 crc kubenswrapper[4772]: I0127 17:44:18.374826 4772 generic.go:334] "Generic (PLEG): container finished" 
podID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerID="b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a" exitCode=0 Jan 27 17:44:18 crc kubenswrapper[4772]: I0127 17:44:18.374875 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerDied","Data":"b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a"} Jan 27 17:44:18 crc kubenswrapper[4772]: I0127 17:44:18.375153 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerStarted","Data":"5dcb1a2066c81ec8078b97d12fda7371f769744554e102f8bcf84749d10bd1d9"} Jan 27 17:44:19 crc kubenswrapper[4772]: I0127 17:44:19.386992 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerStarted","Data":"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f"} Jan 27 17:44:20 crc kubenswrapper[4772]: I0127 17:44:20.402612 4772 generic.go:334] "Generic (PLEG): container finished" podID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerID="e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f" exitCode=0 Jan 27 17:44:20 crc kubenswrapper[4772]: I0127 17:44:20.402665 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerDied","Data":"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f"} Jan 27 17:44:21 crc kubenswrapper[4772]: I0127 17:44:21.412065 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerStarted","Data":"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507"} 
Jan 27 17:44:21 crc kubenswrapper[4772]: I0127 17:44:21.434330 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ztd7g" podStartSLOduration=3.00399019 podStartE2EDuration="5.434311503s" podCreationTimestamp="2026-01-27 17:44:16 +0000 UTC" firstStartedPulling="2026-01-27 17:44:18.37700394 +0000 UTC m=+9444.357613038" lastFinishedPulling="2026-01-27 17:44:20.807325263 +0000 UTC m=+9446.787934351" observedRunningTime="2026-01-27 17:44:21.427944581 +0000 UTC m=+9447.408553709" watchObservedRunningTime="2026-01-27 17:44:21.434311503 +0000 UTC m=+9447.414920611" Jan 27 17:44:26 crc kubenswrapper[4772]: I0127 17:44:26.878950 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:26 crc kubenswrapper[4772]: I0127 17:44:26.879763 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:26 crc kubenswrapper[4772]: I0127 17:44:26.941019 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:27 crc kubenswrapper[4772]: I0127 17:44:27.539939 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:27 crc kubenswrapper[4772]: I0127 17:44:27.609510 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:29 crc kubenswrapper[4772]: I0127 17:44:29.512043 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ztd7g" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="registry-server" containerID="cri-o://4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507" gracePeriod=2 Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.012900 4772 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.118416 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx5j7\" (UniqueName: \"kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7\") pod \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.118984 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities\") pod \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.119061 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content\") pod \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\" (UID: \"cb56c62f-1902-42bd-9f7a-3f0e20c01d88\") " Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.120033 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities" (OuterVolumeSpecName: "utilities") pod "cb56c62f-1902-42bd-9f7a-3f0e20c01d88" (UID: "cb56c62f-1902-42bd-9f7a-3f0e20c01d88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.139273 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7" (OuterVolumeSpecName: "kube-api-access-nx5j7") pod "cb56c62f-1902-42bd-9f7a-3f0e20c01d88" (UID: "cb56c62f-1902-42bd-9f7a-3f0e20c01d88"). 
InnerVolumeSpecName "kube-api-access-nx5j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.144537 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb56c62f-1902-42bd-9f7a-3f0e20c01d88" (UID: "cb56c62f-1902-42bd-9f7a-3f0e20c01d88"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.222618 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.222705 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.222738 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx5j7\" (UniqueName: \"kubernetes.io/projected/cb56c62f-1902-42bd-9f7a-3f0e20c01d88-kube-api-access-nx5j7\") on node \"crc\" DevicePath \"\"" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.527916 4772 generic.go:334] "Generic (PLEG): container finished" podID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerID="4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507" exitCode=0 Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.527983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerDied","Data":"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507"} Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.528028 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ztd7g" event={"ID":"cb56c62f-1902-42bd-9f7a-3f0e20c01d88","Type":"ContainerDied","Data":"5dcb1a2066c81ec8078b97d12fda7371f769744554e102f8bcf84749d10bd1d9"} Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.528037 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ztd7g" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.528079 4772 scope.go:117] "RemoveContainer" containerID="4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.571560 4772 scope.go:117] "RemoveContainer" containerID="e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.600050 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.610332 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ztd7g"] Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.617946 4772 scope.go:117] "RemoveContainer" containerID="b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.648912 4772 scope.go:117] "RemoveContainer" containerID="4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507" Jan 27 17:44:30 crc kubenswrapper[4772]: E0127 17:44:30.649584 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507\": container with ID starting with 4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507 not found: ID does not exist" containerID="4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507" Jan 27 17:44:30 crc 
kubenswrapper[4772]: I0127 17:44:30.649628 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507"} err="failed to get container status \"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507\": rpc error: code = NotFound desc = could not find container \"4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507\": container with ID starting with 4cbc758915cd523a83bffdd5d4476bbc8eb7f635dfad679b235f04fb5cbef507 not found: ID does not exist" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.649653 4772 scope.go:117] "RemoveContainer" containerID="e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f" Jan 27 17:44:30 crc kubenswrapper[4772]: E0127 17:44:30.649973 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f\": container with ID starting with e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f not found: ID does not exist" containerID="e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.650000 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f"} err="failed to get container status \"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f\": rpc error: code = NotFound desc = could not find container \"e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f\": container with ID starting with e25227c326a2c65eb0e09fe16b317148fadaabfa5b8ca0ec4137d03294784b1f not found: ID does not exist" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.650020 4772 scope.go:117] "RemoveContainer" containerID="b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a" Jan 27 
17:44:30 crc kubenswrapper[4772]: E0127 17:44:30.650442 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a\": container with ID starting with b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a not found: ID does not exist" containerID="b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.650563 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a"} err="failed to get container status \"b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a\": rpc error: code = NotFound desc = could not find container \"b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a\": container with ID starting with b9cc131aafb6b8b3f0d0b0358d7d469d28588ef561bafb8cf718daf3bd6b115a not found: ID does not exist" Jan 27 17:44:30 crc kubenswrapper[4772]: I0127 17:44:30.675902 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" path="/var/lib/kubelet/pods/cb56c62f-1902-42bd-9f7a-3f0e20c01d88/volumes" Jan 27 17:44:31 crc kubenswrapper[4772]: I0127 17:44:31.662820 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:44:31 crc kubenswrapper[4772]: E0127 17:44:31.663678 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:44:42 crc 
kubenswrapper[4772]: I0127 17:44:42.663697 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:44:43 crc kubenswrapper[4772]: I0127 17:44:43.678627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966"} Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.146239 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq"] Jan 27 17:45:00 crc kubenswrapper[4772]: E0127 17:45:00.147465 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="extract-content" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.147491 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="extract-content" Jan 27 17:45:00 crc kubenswrapper[4772]: E0127 17:45:00.147518 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="extract-utilities" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.147529 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="extract-utilities" Jan 27 17:45:00 crc kubenswrapper[4772]: E0127 17:45:00.147577 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="registry-server" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.147589 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="registry-server" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.147934 4772 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cb56c62f-1902-42bd-9f7a-3f0e20c01d88" containerName="registry-server" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.148999 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.152736 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.153041 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.158966 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq"] Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.232646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.232836 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgdf\" (UniqueName: \"kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.232930 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.333853 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgdf\" (UniqueName: \"kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.333950 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.334010 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.334937 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.340058 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.364645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgdf\" (UniqueName: \"kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf\") pod \"collect-profiles-29492265-rzjpq\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.472207 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:00 crc kubenswrapper[4772]: I0127 17:45:00.964510 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq"] Jan 27 17:45:00 crc kubenswrapper[4772]: W0127 17:45:00.977339 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc190402b_ed4d_48d2_984a_6ae0c4457911.slice/crio-5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f WatchSource:0}: Error finding container 5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f: Status 404 returned error can't find the container with id 5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f Jan 27 17:45:01 crc kubenswrapper[4772]: I0127 17:45:01.950949 4772 generic.go:334] "Generic (PLEG): container finished" podID="c190402b-ed4d-48d2-984a-6ae0c4457911" containerID="b9f31bd1c3fa9ae295f35e740124fa42856302304faaba875d6731b304dbd918" exitCode=0 Jan 27 17:45:01 crc 
kubenswrapper[4772]: I0127 17:45:01.951208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" event={"ID":"c190402b-ed4d-48d2-984a-6ae0c4457911","Type":"ContainerDied","Data":"b9f31bd1c3fa9ae295f35e740124fa42856302304faaba875d6731b304dbd918"} Jan 27 17:45:01 crc kubenswrapper[4772]: I0127 17:45:01.951297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" event={"ID":"c190402b-ed4d-48d2-984a-6ae0c4457911","Type":"ContainerStarted","Data":"5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f"} Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.329615 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.389968 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzgdf\" (UniqueName: \"kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf\") pod \"c190402b-ed4d-48d2-984a-6ae0c4457911\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.390260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume\") pod \"c190402b-ed4d-48d2-984a-6ae0c4457911\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.390312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume\") pod \"c190402b-ed4d-48d2-984a-6ae0c4457911\" (UID: \"c190402b-ed4d-48d2-984a-6ae0c4457911\") " Jan 27 17:45:03 crc kubenswrapper[4772]: 
I0127 17:45:03.391089 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume" (OuterVolumeSpecName: "config-volume") pod "c190402b-ed4d-48d2-984a-6ae0c4457911" (UID: "c190402b-ed4d-48d2-984a-6ae0c4457911"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.396901 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf" (OuterVolumeSpecName: "kube-api-access-zzgdf") pod "c190402b-ed4d-48d2-984a-6ae0c4457911" (UID: "c190402b-ed4d-48d2-984a-6ae0c4457911"). InnerVolumeSpecName "kube-api-access-zzgdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.397473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c190402b-ed4d-48d2-984a-6ae0c4457911" (UID: "c190402b-ed4d-48d2-984a-6ae0c4457911"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.492515 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c190402b-ed4d-48d2-984a-6ae0c4457911-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.492561 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c190402b-ed4d-48d2-984a-6ae0c4457911-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.492578 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzgdf\" (UniqueName: \"kubernetes.io/projected/c190402b-ed4d-48d2-984a-6ae0c4457911-kube-api-access-zzgdf\") on node \"crc\" DevicePath \"\"" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.970881 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" event={"ID":"c190402b-ed4d-48d2-984a-6ae0c4457911","Type":"ContainerDied","Data":"5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f"} Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.970924 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afbed58df246672c1baa506912246529be637fdd0a5d30927af1bb8c0f2373f" Jan 27 17:45:03 crc kubenswrapper[4772]: I0127 17:45:03.970989 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492265-rzjpq" Jan 27 17:45:04 crc kubenswrapper[4772]: I0127 17:45:04.427131 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps"] Jan 27 17:45:04 crc kubenswrapper[4772]: I0127 17:45:04.434680 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492220-sqfps"] Jan 27 17:45:04 crc kubenswrapper[4772]: I0127 17:45:04.683048 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73038a7f-6c26-47b7-ad06-bd235e268224" path="/var/lib/kubelet/pods/73038a7f-6c26-47b7-ad06-bd235e268224/volumes" Jan 27 17:45:11 crc kubenswrapper[4772]: I0127 17:45:11.648072 4772 scope.go:117] "RemoveContainer" containerID="21fbf772d614ea3a35cb7d6244635ba0574a7b4a610726fba6575e200d3d3209" Jan 27 17:46:42 crc kubenswrapper[4772]: I0127 17:46:42.058956 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:46:42 crc kubenswrapper[4772]: I0127 17:46:42.059568 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.345314 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:46:50 crc kubenswrapper[4772]: E0127 17:46:50.346833 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c190402b-ed4d-48d2-984a-6ae0c4457911" containerName="collect-profiles" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.346851 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c190402b-ed4d-48d2-984a-6ae0c4457911" containerName="collect-profiles" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.347081 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c190402b-ed4d-48d2-984a-6ae0c4457911" containerName="collect-profiles" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.348682 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.353845 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.547033 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gljr4\" (UniqueName: \"kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.547759 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.548019 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities\") pod \"certified-operators-gj8cc\" (UID: 
\"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.649775 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.649868 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.649938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gljr4\" (UniqueName: \"kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.650395 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.650571 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") 
" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.681140 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gljr4\" (UniqueName: \"kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4\") pod \"certified-operators-gj8cc\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:50 crc kubenswrapper[4772]: I0127 17:46:50.975423 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:46:51 crc kubenswrapper[4772]: I0127 17:46:51.427725 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:46:52 crc kubenswrapper[4772]: I0127 17:46:52.065518 4772 generic.go:334] "Generic (PLEG): container finished" podID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerID="3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa" exitCode=0 Jan 27 17:46:52 crc kubenswrapper[4772]: I0127 17:46:52.065573 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerDied","Data":"3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa"} Jan 27 17:46:52 crc kubenswrapper[4772]: I0127 17:46:52.065861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerStarted","Data":"27d4dc9c40f39e670cd2a0ed4833fa464d1c4b2f7c7516785daa6f3707fdf84e"} Jan 27 17:46:52 crc kubenswrapper[4772]: I0127 17:46:52.067561 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:46:53 crc kubenswrapper[4772]: I0127 17:46:53.077857 4772 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerStarted","Data":"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4"} Jan 27 17:46:54 crc kubenswrapper[4772]: I0127 17:46:54.093525 4772 generic.go:334] "Generic (PLEG): container finished" podID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerID="943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4" exitCode=0 Jan 27 17:46:54 crc kubenswrapper[4772]: I0127 17:46:54.093858 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerDied","Data":"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4"} Jan 27 17:46:55 crc kubenswrapper[4772]: I0127 17:46:55.105302 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerStarted","Data":"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400"} Jan 27 17:46:55 crc kubenswrapper[4772]: I0127 17:46:55.130456 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gj8cc" podStartSLOduration=2.551489794 podStartE2EDuration="5.130441511s" podCreationTimestamp="2026-01-27 17:46:50 +0000 UTC" firstStartedPulling="2026-01-27 17:46:52.067362604 +0000 UTC m=+9598.047971702" lastFinishedPulling="2026-01-27 17:46:54.646314321 +0000 UTC m=+9600.626923419" observedRunningTime="2026-01-27 17:46:55.122053872 +0000 UTC m=+9601.102662990" watchObservedRunningTime="2026-01-27 17:46:55.130441511 +0000 UTC m=+9601.111050609" Jan 27 17:47:00 crc kubenswrapper[4772]: I0127 17:47:00.976457 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:00 crc kubenswrapper[4772]: I0127 
17:47:00.978269 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:01 crc kubenswrapper[4772]: I0127 17:47:01.034722 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:01 crc kubenswrapper[4772]: I0127 17:47:01.221075 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:01 crc kubenswrapper[4772]: I0127 17:47:01.280830 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.188678 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gj8cc" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="registry-server" containerID="cri-o://6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400" gracePeriod=2 Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.643562 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.766663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content\") pod \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.766762 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gljr4\" (UniqueName: \"kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4\") pod \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.766802 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities\") pod \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\" (UID: \"4d35ce00-8f21-48c4-ac86-b51879a0f1a0\") " Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.769239 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities" (OuterVolumeSpecName: "utilities") pod "4d35ce00-8f21-48c4-ac86-b51879a0f1a0" (UID: "4d35ce00-8f21-48c4-ac86-b51879a0f1a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.777519 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4" (OuterVolumeSpecName: "kube-api-access-gljr4") pod "4d35ce00-8f21-48c4-ac86-b51879a0f1a0" (UID: "4d35ce00-8f21-48c4-ac86-b51879a0f1a0"). InnerVolumeSpecName "kube-api-access-gljr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.868746 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gljr4\" (UniqueName: \"kubernetes.io/projected/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-kube-api-access-gljr4\") on node \"crc\" DevicePath \"\"" Jan 27 17:47:03 crc kubenswrapper[4772]: I0127 17:47:03.868967 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.091570 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d35ce00-8f21-48c4-ac86-b51879a0f1a0" (UID: "4d35ce00-8f21-48c4-ac86-b51879a0f1a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.175205 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d35ce00-8f21-48c4-ac86-b51879a0f1a0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.214836 4772 generic.go:334] "Generic (PLEG): container finished" podID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerID="6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400" exitCode=0 Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.214899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerDied","Data":"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400"} Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.214916 4772 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gj8cc" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.214939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gj8cc" event={"ID":"4d35ce00-8f21-48c4-ac86-b51879a0f1a0","Type":"ContainerDied","Data":"27d4dc9c40f39e670cd2a0ed4833fa464d1c4b2f7c7516785daa6f3707fdf84e"} Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.214957 4772 scope.go:117] "RemoveContainer" containerID="6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.265523 4772 scope.go:117] "RemoveContainer" containerID="943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.269191 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.279054 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gj8cc"] Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.293867 4772 scope.go:117] "RemoveContainer" containerID="3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.343269 4772 scope.go:117] "RemoveContainer" containerID="6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400" Jan 27 17:47:04 crc kubenswrapper[4772]: E0127 17:47:04.343754 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400\": container with ID starting with 6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400 not found: ID does not exist" containerID="6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.343808 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400"} err="failed to get container status \"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400\": rpc error: code = NotFound desc = could not find container \"6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400\": container with ID starting with 6f1f87f9eb4015bb32f09732614123231dc5df6622454fbdaa7783dc972f7400 not found: ID does not exist" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.343840 4772 scope.go:117] "RemoveContainer" containerID="943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4" Jan 27 17:47:04 crc kubenswrapper[4772]: E0127 17:47:04.344108 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4\": container with ID starting with 943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4 not found: ID does not exist" containerID="943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.344141 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4"} err="failed to get container status \"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4\": rpc error: code = NotFound desc = could not find container \"943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4\": container with ID starting with 943d9b9f50746b3fc3343e8866594a5e4a51e753c34e1d5555f821e76c7457c4 not found: ID does not exist" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.344159 4772 scope.go:117] "RemoveContainer" containerID="3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa" Jan 27 17:47:04 crc kubenswrapper[4772]: E0127 
17:47:04.344707 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa\": container with ID starting with 3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa not found: ID does not exist" containerID="3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.344794 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa"} err="failed to get container status \"3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa\": rpc error: code = NotFound desc = could not find container \"3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa\": container with ID starting with 3563a42c4868e6463dd4fd0526fe31c9ba5bd18efc51c3bed319b5a0efe056aa not found: ID does not exist" Jan 27 17:47:04 crc kubenswrapper[4772]: I0127 17:47:04.677109 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" path="/var/lib/kubelet/pods/4d35ce00-8f21-48c4-ac86-b51879a0f1a0/volumes" Jan 27 17:47:12 crc kubenswrapper[4772]: I0127 17:47:12.058972 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:47:12 crc kubenswrapper[4772]: I0127 17:47:12.059685 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.058709 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.059575 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.060467 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.061775 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.061912 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966" gracePeriod=600 Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.644351 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" 
containerID="c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966" exitCode=0 Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.644450 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966"} Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.644713 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025"} Jan 27 17:47:42 crc kubenswrapper[4772]: I0127 17:47:42.644734 4772 scope.go:117] "RemoveContainer" containerID="900c1aed2e392c4c453d12a872ae4215d2e07d45524a30632831cba37945e88d" Jan 27 17:49:42 crc kubenswrapper[4772]: I0127 17:49:42.058963 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:49:42 crc kubenswrapper[4772]: I0127 17:49:42.059848 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.714221 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7dvs7/must-gather-vr6mx"] Jan 27 17:50:07 crc kubenswrapper[4772]: E0127 17:50:07.716254 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="extract-content" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.716361 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="extract-content" Jan 27 17:50:07 crc kubenswrapper[4772]: E0127 17:50:07.716455 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="extract-utilities" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.716530 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="extract-utilities" Jan 27 17:50:07 crc kubenswrapper[4772]: E0127 17:50:07.716608 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="registry-server" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.716689 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="registry-server" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.716999 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d35ce00-8f21-48c4-ac86-b51879a0f1a0" containerName="registry-server" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.718363 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.726215 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7dvs7"/"default-dockercfg-rfw2p" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.726588 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7dvs7"/"kube-root-ca.crt" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.726756 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7dvs7"/"openshift-service-ca.crt" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.732331 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7dvs7/must-gather-vr6mx"] Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.801229 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.801316 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwmf\" (UniqueName: \"kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.903635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " 
pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.903798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwmf\" (UniqueName: \"kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.904069 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:07 crc kubenswrapper[4772]: I0127 17:50:07.936426 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwmf\" (UniqueName: \"kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf\") pod \"must-gather-vr6mx\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:08 crc kubenswrapper[4772]: I0127 17:50:08.052821 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:50:08 crc kubenswrapper[4772]: I0127 17:50:08.525668 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7dvs7/must-gather-vr6mx"] Jan 27 17:50:09 crc kubenswrapper[4772]: I0127 17:50:09.110032 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" event={"ID":"62afdf43-7cc5-4d53-aff8-2fd18fbfd493","Type":"ContainerStarted","Data":"73a87bca576775c52e6251c92bef7e518f388fa9a11be3bd3902b9b4c38f77b6"} Jan 27 17:50:12 crc kubenswrapper[4772]: I0127 17:50:12.059097 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:50:12 crc kubenswrapper[4772]: I0127 17:50:12.059910 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:50:16 crc kubenswrapper[4772]: I0127 17:50:16.182097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" event={"ID":"62afdf43-7cc5-4d53-aff8-2fd18fbfd493","Type":"ContainerStarted","Data":"7d2e1870707eb567a3ae7d6fc16a9352a0ad811fd3af2c67ad9db194e0056b53"} Jan 27 17:50:16 crc kubenswrapper[4772]: I0127 17:50:16.182717 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" event={"ID":"62afdf43-7cc5-4d53-aff8-2fd18fbfd493","Type":"ContainerStarted","Data":"daea92a47f41c2b8eac7ad3a9eb9829829dd28a915856ca1714eacc250cb2602"} Jan 27 17:50:16 crc 
kubenswrapper[4772]: I0127 17:50:16.217729 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" podStartSLOduration=2.695294125 podStartE2EDuration="9.217701194s" podCreationTimestamp="2026-01-27 17:50:07 +0000 UTC" firstStartedPulling="2026-01-27 17:50:08.534806985 +0000 UTC m=+9794.515416093" lastFinishedPulling="2026-01-27 17:50:15.057214064 +0000 UTC m=+9801.037823162" observedRunningTime="2026-01-27 17:50:16.207139393 +0000 UTC m=+9802.187748491" watchObservedRunningTime="2026-01-27 17:50:16.217701194 +0000 UTC m=+9802.198310312" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.302019 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-fch7g"] Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.306278 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.360845 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.360981 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpdvc\" (UniqueName: \"kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.462626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpdvc\" (UniqueName: 
\"kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.462790 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.462931 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.488942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpdvc\" (UniqueName: \"kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc\") pod \"crc-debug-fch7g\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:20 crc kubenswrapper[4772]: I0127 17:50:20.647365 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:21 crc kubenswrapper[4772]: I0127 17:50:21.224730 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" event={"ID":"3b7e9e60-6e19-49be-866a-fba6fd4b0780","Type":"ContainerStarted","Data":"ab84431cd9516b39dc9cfd9729d8b5ee60ad8c495c7dcade7941ee04e47fe49d"} Jan 27 17:50:32 crc kubenswrapper[4772]: I0127 17:50:32.355298 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" event={"ID":"3b7e9e60-6e19-49be-866a-fba6fd4b0780","Type":"ContainerStarted","Data":"1ea32c15c368715377429d9d913ced250c3cced4bf09ea7a9096c6fa505dcd27"} Jan 27 17:50:32 crc kubenswrapper[4772]: I0127 17:50:32.379981 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" podStartSLOduration=1.298720013 podStartE2EDuration="12.379961991s" podCreationTimestamp="2026-01-27 17:50:20 +0000 UTC" firstStartedPulling="2026-01-27 17:50:20.682106669 +0000 UTC m=+9806.662715797" lastFinishedPulling="2026-01-27 17:50:31.763348677 +0000 UTC m=+9817.743957775" observedRunningTime="2026-01-27 17:50:32.367924557 +0000 UTC m=+9818.348533655" watchObservedRunningTime="2026-01-27 17:50:32.379961991 +0000 UTC m=+9818.360571089" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.057983 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.058504 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.058556 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.059362 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025"} pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.059423 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" containerID="cri-o://b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" gracePeriod=600 Jan 27 17:50:42 crc kubenswrapper[4772]: E0127 17:50:42.370472 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.442883 4772 generic.go:334] "Generic (PLEG): container finished" podID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" exitCode=0 Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.442945 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerDied","Data":"b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025"} Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.442988 4772 scope.go:117] "RemoveContainer" containerID="c95b967b4844ba74a00799daa1c360319d22d3be7e94ef067fa431b967e2d966" Jan 27 17:50:42 crc kubenswrapper[4772]: I0127 17:50:42.443840 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:50:42 crc kubenswrapper[4772]: E0127 17:50:42.444230 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.781563 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.784380 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.798466 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.936958 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8q5\" (UniqueName: \"kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.937041 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:44 crc kubenswrapper[4772]: I0127 17:50:44.937146 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.038680 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.039211 4772 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-9m8q5\" (UniqueName: \"kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.039261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.039647 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.039790 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.071511 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8q5\" (UniqueName: \"kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5\") pod \"redhat-operators-wwjcs\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.108478 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:45 crc kubenswrapper[4772]: I0127 17:50:45.645001 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:50:46 crc kubenswrapper[4772]: I0127 17:50:46.481037 4772 generic.go:334] "Generic (PLEG): container finished" podID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerID="89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b" exitCode=0 Jan 27 17:50:46 crc kubenswrapper[4772]: I0127 17:50:46.481112 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerDied","Data":"89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b"} Jan 27 17:50:46 crc kubenswrapper[4772]: I0127 17:50:46.481865 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerStarted","Data":"bc5839b7d468657c30d46f03fedecb8b4e0ef4ce5baaaea4f34f78a6a2e457ca"} Jan 27 17:50:49 crc kubenswrapper[4772]: I0127 17:50:49.505460 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerStarted","Data":"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06"} Jan 27 17:50:52 crc kubenswrapper[4772]: I0127 17:50:52.662851 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:50:52 crc kubenswrapper[4772]: E0127 17:50:52.663681 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:50:53 crc kubenswrapper[4772]: I0127 17:50:53.329644 4772 generic.go:334] "Generic (PLEG): container finished" podID="3b7e9e60-6e19-49be-866a-fba6fd4b0780" containerID="1ea32c15c368715377429d9d913ced250c3cced4bf09ea7a9096c6fa505dcd27" exitCode=0 Jan 27 17:50:53 crc kubenswrapper[4772]: I0127 17:50:53.329813 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" event={"ID":"3b7e9e60-6e19-49be-866a-fba6fd4b0780","Type":"ContainerDied","Data":"1ea32c15c368715377429d9d913ced250c3cced4bf09ea7a9096c6fa505dcd27"} Jan 27 17:50:53 crc kubenswrapper[4772]: I0127 17:50:53.334608 4772 generic.go:334] "Generic (PLEG): container finished" podID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerID="407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06" exitCode=0 Jan 27 17:50:53 crc kubenswrapper[4772]: I0127 17:50:53.334647 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerDied","Data":"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06"} Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.344285 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerStarted","Data":"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd"} Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.378141 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wwjcs" podStartSLOduration=2.85193684 podStartE2EDuration="10.378113555s" podCreationTimestamp="2026-01-27 17:50:44 +0000 UTC" 
firstStartedPulling="2026-01-27 17:50:46.484785628 +0000 UTC m=+9832.465394726" lastFinishedPulling="2026-01-27 17:50:54.010962343 +0000 UTC m=+9839.991571441" observedRunningTime="2026-01-27 17:50:54.371753413 +0000 UTC m=+9840.352362531" watchObservedRunningTime="2026-01-27 17:50:54.378113555 +0000 UTC m=+9840.358722673" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.456596 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.506291 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-fch7g"] Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.510432 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-fch7g"] Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.649550 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host\") pod \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.649696 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host" (OuterVolumeSpecName: "host") pod "3b7e9e60-6e19-49be-866a-fba6fd4b0780" (UID: "3b7e9e60-6e19-49be-866a-fba6fd4b0780"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.650116 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpdvc\" (UniqueName: \"kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc\") pod \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\" (UID: \"3b7e9e60-6e19-49be-866a-fba6fd4b0780\") " Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.650778 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3b7e9e60-6e19-49be-866a-fba6fd4b0780-host\") on node \"crc\" DevicePath \"\"" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.658420 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc" (OuterVolumeSpecName: "kube-api-access-xpdvc") pod "3b7e9e60-6e19-49be-866a-fba6fd4b0780" (UID: "3b7e9e60-6e19-49be-866a-fba6fd4b0780"). InnerVolumeSpecName "kube-api-access-xpdvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.677154 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7e9e60-6e19-49be-866a-fba6fd4b0780" path="/var/lib/kubelet/pods/3b7e9e60-6e19-49be-866a-fba6fd4b0780/volumes" Jan 27 17:50:54 crc kubenswrapper[4772]: I0127 17:50:54.753488 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpdvc\" (UniqueName: \"kubernetes.io/projected/3b7e9e60-6e19-49be-866a-fba6fd4b0780-kube-api-access-xpdvc\") on node \"crc\" DevicePath \"\"" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.109334 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.109404 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.353675 4772 scope.go:117] "RemoveContainer" containerID="1ea32c15c368715377429d9d913ced250c3cced4bf09ea7a9096c6fa505dcd27" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.353801 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-fch7g" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.728787 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-5r4qc"] Jan 27 17:50:55 crc kubenswrapper[4772]: E0127 17:50:55.729603 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7e9e60-6e19-49be-866a-fba6fd4b0780" containerName="container-00" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.729625 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7e9e60-6e19-49be-866a-fba6fd4b0780" containerName="container-00" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.729867 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7e9e60-6e19-49be-866a-fba6fd4b0780" containerName="container-00" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.730662 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.873025 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf8sr\" (UniqueName: \"kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.873222 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.974801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf8sr\" (UniqueName: 
\"kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.974874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.975041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:55 crc kubenswrapper[4772]: I0127 17:50:55.996813 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf8sr\" (UniqueName: \"kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr\") pod \"crc-debug-5r4qc\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:56 crc kubenswrapper[4772]: I0127 17:50:56.051647 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:56 crc kubenswrapper[4772]: W0127 17:50:56.102328 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ac0d9e_ba08_48c0_9a6c_307c3438b86e.slice/crio-2567decd5ebaaa4f4f481700f74363f342d3f4cbc7cd8ddc7aefddfba8d1986a WatchSource:0}: Error finding container 2567decd5ebaaa4f4f481700f74363f342d3f4cbc7cd8ddc7aefddfba8d1986a: Status 404 returned error can't find the container with id 2567decd5ebaaa4f4f481700f74363f342d3f4cbc7cd8ddc7aefddfba8d1986a Jan 27 17:50:56 crc kubenswrapper[4772]: I0127 17:50:56.163290 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wwjcs" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" probeResult="failure" output=< Jan 27 17:50:56 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:50:56 crc kubenswrapper[4772]: > Jan 27 17:50:56 crc kubenswrapper[4772]: I0127 17:50:56.369108 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" event={"ID":"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e","Type":"ContainerStarted","Data":"1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8"} Jan 27 17:50:56 crc kubenswrapper[4772]: I0127 17:50:56.369146 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" event={"ID":"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e","Type":"ContainerStarted","Data":"2567decd5ebaaa4f4f481700f74363f342d3f4cbc7cd8ddc7aefddfba8d1986a"} Jan 27 17:50:56 crc kubenswrapper[4772]: I0127 17:50:56.390494 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" podStartSLOduration=1.390460085 podStartE2EDuration="1.390460085s" podCreationTimestamp="2026-01-27 17:50:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 17:50:56.381288233 +0000 UTC m=+9842.361897361" watchObservedRunningTime="2026-01-27 17:50:56.390460085 +0000 UTC m=+9842.371069183" Jan 27 17:50:56 crc kubenswrapper[4772]: E0127 17:50:56.587818 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ac0d9e_ba08_48c0_9a6c_307c3438b86e.slice/crio-conmon-1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2ac0d9e_ba08_48c0_9a6c_307c3438b86e.slice/crio-1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8.scope\": RecentStats: unable to find data in memory cache]" Jan 27 17:50:57 crc kubenswrapper[4772]: I0127 17:50:57.395588 4772 generic.go:334] "Generic (PLEG): container finished" podID="a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" containerID="1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8" exitCode=1 Jan 27 17:50:57 crc kubenswrapper[4772]: I0127 17:50:57.395840 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" event={"ID":"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e","Type":"ContainerDied","Data":"1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8"} Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.511537 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.545513 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-5r4qc"] Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.554458 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7dvs7/crc-debug-5r4qc"] Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.621637 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host\") pod \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.621744 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host" (OuterVolumeSpecName: "host") pod "a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" (UID: "a2ac0d9e-ba08-48c0-9a6c-307c3438b86e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.621793 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf8sr\" (UniqueName: \"kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr\") pod \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\" (UID: \"a2ac0d9e-ba08-48c0-9a6c-307c3438b86e\") " Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.622364 4772 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-host\") on node \"crc\" DevicePath \"\"" Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.629430 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr" (OuterVolumeSpecName: "kube-api-access-jf8sr") pod "a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" (UID: "a2ac0d9e-ba08-48c0-9a6c-307c3438b86e"). InnerVolumeSpecName "kube-api-access-jf8sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.679055 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" path="/var/lib/kubelet/pods/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e/volumes" Jan 27 17:50:58 crc kubenswrapper[4772]: I0127 17:50:58.723990 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf8sr\" (UniqueName: \"kubernetes.io/projected/a2ac0d9e-ba08-48c0-9a6c-307c3438b86e-kube-api-access-jf8sr\") on node \"crc\" DevicePath \"\"" Jan 27 17:50:59 crc kubenswrapper[4772]: I0127 17:50:59.415092 4772 scope.go:117] "RemoveContainer" containerID="1323a4c6a49cb722cb00e0194b58825dbab94b8aff7bf910a6baddc9ff7aedd8" Jan 27 17:50:59 crc kubenswrapper[4772]: I0127 17:50:59.415204 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7dvs7/crc-debug-5r4qc" Jan 27 17:51:06 crc kubenswrapper[4772]: I0127 17:51:06.153611 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wwjcs" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" probeResult="failure" output=< Jan 27 17:51:06 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:51:06 crc kubenswrapper[4772]: > Jan 27 17:51:06 crc kubenswrapper[4772]: I0127 17:51:06.664926 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:51:06 crc kubenswrapper[4772]: E0127 17:51:06.665528 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:51:16 crc kubenswrapper[4772]: I0127 17:51:16.330393 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wwjcs" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" probeResult="failure" output=< Jan 27 17:51:16 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Jan 27 17:51:16 crc kubenswrapper[4772]: > Jan 27 17:51:17 crc kubenswrapper[4772]: I0127 17:51:17.663253 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:51:17 crc kubenswrapper[4772]: E0127 17:51:17.663771 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:51:25 crc kubenswrapper[4772]: I0127 17:51:25.446152 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:51:25 crc kubenswrapper[4772]: I0127 17:51:25.500489 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:51:25 crc kubenswrapper[4772]: I0127 17:51:25.688820 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:51:26 crc kubenswrapper[4772]: I0127 17:51:26.659578 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wwjcs" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" containerID="cri-o://fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd" gracePeriod=2 Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.378687 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.511431 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities\") pod \"4bec6411-aae2-47f7-a721-a39d6d381c17\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.511867 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content\") pod \"4bec6411-aae2-47f7-a721-a39d6d381c17\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.511997 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m8q5\" (UniqueName: \"kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5\") pod \"4bec6411-aae2-47f7-a721-a39d6d381c17\" (UID: \"4bec6411-aae2-47f7-a721-a39d6d381c17\") " Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.512473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities" (OuterVolumeSpecName: "utilities") pod "4bec6411-aae2-47f7-a721-a39d6d381c17" (UID: "4bec6411-aae2-47f7-a721-a39d6d381c17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.512658 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.529402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5" (OuterVolumeSpecName: "kube-api-access-9m8q5") pod "4bec6411-aae2-47f7-a721-a39d6d381c17" (UID: "4bec6411-aae2-47f7-a721-a39d6d381c17"). InnerVolumeSpecName "kube-api-access-9m8q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.614741 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m8q5\" (UniqueName: \"kubernetes.io/projected/4bec6411-aae2-47f7-a721-a39d6d381c17-kube-api-access-9m8q5\") on node \"crc\" DevicePath \"\"" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.670106 4772 generic.go:334] "Generic (PLEG): container finished" podID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerID="fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd" exitCode=0 Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.670148 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerDied","Data":"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd"} Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.670194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjcs" event={"ID":"4bec6411-aae2-47f7-a721-a39d6d381c17","Type":"ContainerDied","Data":"bc5839b7d468657c30d46f03fedecb8b4e0ef4ce5baaaea4f34f78a6a2e457ca"} Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 
17:51:27.670229 4772 scope.go:117] "RemoveContainer" containerID="fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.670292 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjcs" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.676015 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bec6411-aae2-47f7-a721-a39d6d381c17" (UID: "4bec6411-aae2-47f7-a721-a39d6d381c17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.691852 4772 scope.go:117] "RemoveContainer" containerID="407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.707279 4772 scope.go:117] "RemoveContainer" containerID="89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.716206 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bec6411-aae2-47f7-a721-a39d6d381c17-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.758052 4772 scope.go:117] "RemoveContainer" containerID="fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd" Jan 27 17:51:27 crc kubenswrapper[4772]: E0127 17:51:27.758503 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd\": container with ID starting with fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd not found: ID does not exist" 
containerID="fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.758570 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd"} err="failed to get container status \"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd\": rpc error: code = NotFound desc = could not find container \"fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd\": container with ID starting with fc9c3049ded284cad5528fcbc82380287f3e9e60849f185709b1cbf6091758dd not found: ID does not exist" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.758603 4772 scope.go:117] "RemoveContainer" containerID="407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06" Jan 27 17:51:27 crc kubenswrapper[4772]: E0127 17:51:27.759022 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06\": container with ID starting with 407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06 not found: ID does not exist" containerID="407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.759059 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06"} err="failed to get container status \"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06\": rpc error: code = NotFound desc = could not find container \"407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06\": container with ID starting with 407a6958420e1bfb3d0405a21d3152ae16fe037a897624d55dc2bc173169ab06 not found: ID does not exist" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.759073 4772 scope.go:117] 
"RemoveContainer" containerID="89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b" Jan 27 17:51:27 crc kubenswrapper[4772]: E0127 17:51:27.759461 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b\": container with ID starting with 89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b not found: ID does not exist" containerID="89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b" Jan 27 17:51:27 crc kubenswrapper[4772]: I0127 17:51:27.759493 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b"} err="failed to get container status \"89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b\": rpc error: code = NotFound desc = could not find container \"89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b\": container with ID starting with 89b9352f77c65a63cab44c13397c73a1d6c97dafb21ef79db13cbb8873b80e0b not found: ID does not exist" Jan 27 17:51:28 crc kubenswrapper[4772]: I0127 17:51:28.028277 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:51:28 crc kubenswrapper[4772]: I0127 17:51:28.037858 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wwjcs"] Jan 27 17:51:28 crc kubenswrapper[4772]: I0127 17:51:28.673996 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" path="/var/lib/kubelet/pods/4bec6411-aae2-47f7-a721-a39d6d381c17/volumes" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.019975 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7dcb7f9846-lrk6t_5c77e7c3-5320-4fa6-810d-bc819a6f7b03/barbican-api/0.log" Jan 27 17:51:30 crc 
kubenswrapper[4772]: I0127 17:51:30.247516 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7dcb7f9846-lrk6t_5c77e7c3-5320-4fa6-810d-bc819a6f7b03/barbican-api-log/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.269935 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75df4b6d74-xpp9t_2da34b58-6b43-4e25-bdec-39985c344819/barbican-keystone-listener/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.434893 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75df4b6d74-xpp9t_2da34b58-6b43-4e25-bdec-39985c344819/barbican-keystone-listener-log/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.500045 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b64c5dd7-dj9pw_0d49f4dc-fd69-4e43-9866-87af6da31197/barbican-worker/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.578922 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69b64c5dd7-dj9pw_0d49f4dc-fd69-4e43-9866-87af6da31197/barbican-worker-log/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.690350 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a46febaf-97b6-4ed3-8958-316e2a542a5f/cinder-api-log/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.730624 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a46febaf-97b6-4ed3-8958-316e2a542a5f/cinder-api/0.log" Jan 27 17:51:30 crc kubenswrapper[4772]: I0127 17:51:30.984694 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_37170132-cd9f-44e7-827d-b98486cefb39/probe/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.025239 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_37170132-cd9f-44e7-827d-b98486cefb39/cinder-backup/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.075894 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_66512484-80ba-4887-b9a9-9cc87a65ad18/cinder-scheduler/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.233525 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_66512484-80ba-4887-b9a9-9cc87a65ad18/probe/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.284711 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_7b6810b2-bc50-486d-9a87-cf4cd50d33c5/cinder-volume/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.327316 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_7b6810b2-bc50-486d-9a87-cf4cd50d33c5/probe/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.477111 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8896c5c8c-s6z7x_22e08251-8371-4470-bc3e-d88d673d56f3/init/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.644553 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8896c5c8c-s6z7x_22e08251-8371-4470-bc3e-d88d673d56f3/init/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.663571 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:51:31 crc kubenswrapper[4772]: E0127 17:51:31.663865 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.675119 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8896c5c8c-s6z7x_22e08251-8371-4470-bc3e-d88d673d56f3/dnsmasq-dns/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.732574 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7917063-9e04-41e8-8fb9-e8383f839bd6/glance-httpd/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.839386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7917063-9e04-41e8-8fb9-e8383f839bd6/glance-log/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.937360 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a1535f57-0540-45ea-b53c-1b4cac461cf3/glance-log/0.log" Jan 27 17:51:31 crc kubenswrapper[4772]: I0127 17:51:31.965273 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a1535f57-0540-45ea-b53c-1b4cac461cf3/glance-httpd/0.log" Jan 27 17:51:32 crc kubenswrapper[4772]: I0127 17:51:32.191772 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29492221-sbxjp_8459d055-62d3-4699-b477-ea15946b982c/keystone-cron/0.log" Jan 27 17:51:32 crc kubenswrapper[4772]: I0127 17:51:32.191947 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56b5f9d6fc-hmz72_f6255916-357e-42c6-b936-27151f6b2260/keystone-api/0.log" Jan 27 17:51:32 crc kubenswrapper[4772]: I0127 17:51:32.473386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_7db35434-01e2-470d-bb27-8e30189936b3/adoption/0.log" Jan 27 17:51:32 crc kubenswrapper[4772]: I0127 17:51:32.820734 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-8bf5d4b7c-bfg78_b5a89957-107d-449b-b438-2215fd4ed522/neutron-api/0.log" Jan 27 17:51:32 crc kubenswrapper[4772]: I0127 17:51:32.883277 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8bf5d4b7c-bfg78_b5a89957-107d-449b-b438-2215fd4ed522/neutron-httpd/0.log" Jan 27 17:51:33 crc kubenswrapper[4772]: I0127 17:51:33.177588 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f178d7e3-af69-4014-8209-5e766a130997/nova-api-api/0.log" Jan 27 17:51:33 crc kubenswrapper[4772]: I0127 17:51:33.266457 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f178d7e3-af69-4014-8209-5e766a130997/nova-api-log/0.log" Jan 27 17:51:33 crc kubenswrapper[4772]: I0127 17:51:33.489959 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3331c1dd-ff2d-4a41-9cb3-731297ae0dc3/nova-cell0-conductor-conductor/0.log" Jan 27 17:51:33 crc kubenswrapper[4772]: I0127 17:51:33.617718 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_9549c89c-f55f-484d-80b2-ca1ad19bf758/nova-cell1-conductor-conductor/0.log" Jan 27 17:51:33 crc kubenswrapper[4772]: I0127 17:51:33.853686 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_db0a7ce2-c175-4632-abe5-f35a6b5ce680/nova-cell1-novncproxy-novncproxy/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.043259 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3a983cf0-2c51-4d6a-af53-f115f3a57360/nova-metadata-log/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.351077 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_3a983cf0-2c51-4d6a-af53-f115f3a57360/nova-metadata-metadata/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.418796 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_c9832ca2-4d35-4533-bdb3-7ac3773e5242/nova-scheduler-scheduler/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.550125 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6d96bf4746-x9c97_9371d269-02b3-4049-aeea-4fd56c648b89/init/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.753899 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6d96bf4746-x9c97_9371d269-02b3-4049-aeea-4fd56c648b89/octavia-api-provider-agent/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.789663 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6d96bf4746-x9c97_9371d269-02b3-4049-aeea-4fd56c648b89/init/0.log" Jan 27 17:51:34 crc kubenswrapper[4772]: I0127 17:51:34.996847 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6d96bf4746-x9c97_9371d269-02b3-4049-aeea-4fd56c648b89/octavia-api/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.365422 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jmckq_d32d3e24-6f03-46cc-b7ae-61383778b183/init/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.581917 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jmckq_d32d3e24-6f03-46cc-b7ae-61383778b183/init/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.636386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-6v7mr_b6f78da3-da1c-4e27-ab65-581c656f74d9/init/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.673837 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-jmckq_d32d3e24-6f03-46cc-b7ae-61383778b183/octavia-healthmanager/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.793133 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-6v7mr_b6f78da3-da1c-4e27-ab65-581c656f74d9/init/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.834728 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-6v7mr_b6f78da3-da1c-4e27-ab65-581c656f74d9/octavia-housekeeping/0.log" Jan 27 17:51:35 crc kubenswrapper[4772]: I0127 17:51:35.901429 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4wcv5_c163e8de-ea19-4a1c-8791-8659b9a09ba3/init/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.072376 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4wcv5_c163e8de-ea19-4a1c-8791-8659b9a09ba3/init/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.080012 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-4wcv5_c163e8de-ea19-4a1c-8791-8659b9a09ba3/octavia-rsyslog/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.163863 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-4q859_ab431622-b724-4ed4-be2b-67ec8b5956db/init/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.383331 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-4q859_ab431622-b724-4ed4-be2b-67ec8b5956db/init/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.545426 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be7c27a4-64d2-4581-8271-5aaf74103b04/mysql-bootstrap/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.564238 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-4q859_ab431622-b724-4ed4-be2b-67ec8b5956db/octavia-worker/0.log" Jan 27 17:51:36 crc kubenswrapper[4772]: I0127 17:51:36.951021 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_be7c27a4-64d2-4581-8271-5aaf74103b04/mysql-bootstrap/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.014048 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eefd7ff4-5222-45cf-aaad-20ebfd50a2ff/mysql-bootstrap/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.022348 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_be7c27a4-64d2-4581-8271-5aaf74103b04/galera/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.245394 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eefd7ff4-5222-45cf-aaad-20ebfd50a2ff/mysql-bootstrap/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.289842 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e05a90a8-dbbb-4e24-ac89-f30360482af9/openstackclient/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.315345 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_eefd7ff4-5222-45cf-aaad-20ebfd50a2ff/galera/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.392186 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_36e53353-e817-4d3d-878e-2b34f7c9192f/memcached/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.509676 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-jv694_febb140e-d26e-43db-9924-0f06739b9a4a/ovn-controller/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.545257 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-sx2ff_3d12bbd4-3d0b-444b-a462-b620a7a5d73d/openstack-network-exporter/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.639123 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-s5xbr_7e5eabb2-229a-4d75-b62d-65be688f753a/ovsdb-server-init/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.873599 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s5xbr_7e5eabb2-229a-4d75-b62d-65be688f753a/ovsdb-server/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.874062 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s5xbr_7e5eabb2-229a-4d75-b62d-65be688f753a/ovsdb-server-init/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.878195 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-s5xbr_7e5eabb2-229a-4d75-b62d-65be688f753a/ovs-vswitchd/0.log" Jan 27 17:51:37 crc kubenswrapper[4772]: I0127 17:51:37.909449 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_6d673b09-a15f-48fc-b399-212dc30fce29/adoption/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.059369 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c221996-e15f-4fe3-bc62-98aac08f546f/ovn-northd/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.088362 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_9c221996-e15f-4fe3-bc62-98aac08f546f/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.179836 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c7dba285-1db4-44d8-bdf4-9de6e8d80adb/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.254794 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c7dba285-1db4-44d8-bdf4-9de6e8d80adb/ovsdbserver-nb/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.503561 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_d647bfb5-69e6-4b10-96ac-5f7fcd72514f/ovsdbserver-nb/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.511570 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d647bfb5-69e6-4b10-96ac-5f7fcd72514f/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.635581 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3e19e84f-6d5e-455b-be78-ae3f04c925b7/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.655979 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_3e19e84f-6d5e-455b-be78-ae3f04c925b7/ovsdbserver-nb/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.730554 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f70d2878-d629-4772-b2a4-697fe18a3760/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.809872 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f70d2878-d629-4772-b2a4-697fe18a3760/ovsdbserver-sb/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.880676 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_e6e7e5c6-90b8-4de9-ae6a-11034616734a/openstack-network-exporter/0.log" Jan 27 17:51:38 crc kubenswrapper[4772]: I0127 17:51:38.919012 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_e6e7e5c6-90b8-4de9-ae6a-11034616734a/ovsdbserver-sb/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.003367 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2/openstack-network-exporter/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.051794 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-2_e1fdf1bd-2cdb-4164-84c4-5c780a0a95b2/ovsdbserver-sb/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.128179 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffd9fc5c6-99g52_c8daf690-d375-4d0a-b763-4b610aaeac45/placement-api/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.205499 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6ffd9fc5c6-99g52_c8daf690-d375-4d0a-b763-4b610aaeac45/placement-log/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.244682 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88/setup-container/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.416054 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88/setup-container/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.442891 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3b5f224-602e-454a-b35e-2e55160016b5/setup-container/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.453571 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1c2904f0-5ba8-4bb4-9952-ca1ee06a4d88/rabbitmq/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.637466 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3b5f224-602e-454a-b35e-2e55160016b5/setup-container/0.log" Jan 27 17:51:39 crc kubenswrapper[4772]: I0127 17:51:39.667352 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f3b5f224-602e-454a-b35e-2e55160016b5/rabbitmq/0.log" Jan 27 17:51:43 crc kubenswrapper[4772]: I0127 17:51:43.663118 4772 scope.go:117] "RemoveContainer" 
containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:51:43 crc kubenswrapper[4772]: E0127 17:51:43.663929 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:51:54 crc kubenswrapper[4772]: I0127 17:51:54.673930 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:51:54 crc kubenswrapper[4772]: E0127 17:51:54.676269 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.473282 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/util/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.712267 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/pull/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.713313 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/pull/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.739136 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/util/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.863525 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/util/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.863709 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/pull/0.log" Jan 27 17:51:58 crc kubenswrapper[4772]: I0127 17:51:58.897096 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6ebbdeb42ee59bc46cd5a9affeefe7a428e186e004b54bc44478e0857b2c5g4_0f29ea34-f593-4806-b5f6-2f9976c46a12/extract/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.154481 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-65ff799cfd-t42n9_674f4da6-f50d-4bab-808d-56ab3b9e2cb4/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.234023 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-655bf9cfbb-cgh7j_fde95124-892b-411a-ba05-fa70927c8838/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.314702 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-77554cdc5c-tkr6j_d395f105-54f0-4497-a119-57802be313a3/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: 
I0127 17:51:59.458607 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-67dd55ff59-hgscb_fb300814-fca7-4419-ac6e-c08b33edd4be/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.499654 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-575ffb885b-mtd9d_4c63a702-50b9-42f3-858e-7e27da0a8d8f/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.665091 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-jcb4p_e85aef3a-e235-473c-94cc-1f6237798b3e/manager/0.log" Jan 27 17:51:59 crc kubenswrapper[4772]: I0127 17:51:59.881747 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-768b776ffb-sxbjn_2df85221-33ed-49be-949c-516810279e4d/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.249381 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-849fcfbb6b-tvrx9_27ec5082-c170-465b-b3a3-1f27a545fd71/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.304321 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-55f684fd56-wzjrz_783d8159-e67a-4796-83d8-4eff27d79505/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.326384 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d75bc88d5-t54fr_e7465bd0-3b6e-4199-9ee6-28b512198847/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.489330 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6b9fb5fdcb-zhd82_b73c175a-e89e-434f-996a-65c1140bb8dd/manager/0.log" Jan 27 17:52:00 crc 
kubenswrapper[4772]: I0127 17:52:00.601540 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7ffd8d76d4-gcpj4_0a88aa66-b634-44ee-8e5b-bfeacb765e57/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.884643 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7875d7675-ktfbt_80584c24-3c75-4624-802f-e608f640eeaa/manager/0.log" Jan 27 17:52:00 crc kubenswrapper[4772]: I0127 17:52:00.918141 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-ddcbfd695-6wltn_e7fc5297-101a-496e-a7c6-e7296e08a5af/manager/0.log" Jan 27 17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.063921 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854q994c_1389813b-42ea-433f-820c-e5b8b41713d7/manager/0.log" Jan 27 17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.169753 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6fb647f7d4-gkjgt_939a692e-65d1-4be8-b78a-22ae83072d51/operator/0.log" Jan 27 17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.408415 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vs9rk_96fcf3a5-2584-4590-8057-9c18a9866bd4/registry-server/0.log" Jan 27 17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.737930 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6f75f45d54-ww79v_e76712a7-ebf6-4f04-a52c-c8d2bacb87f7/manager/0.log" Jan 27 17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.792033 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-79d5ccc684-vwnwk_c3d2f06d-4dd3-49a8-a0a0-54a83cc3f4e8/manager/0.log" Jan 27 
17:52:01 crc kubenswrapper[4772]: I0127 17:52:01.946846 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h9297_abaf1142-1b7c-4987-8a9d-c91e6456c4a5/operator/0.log" Jan 27 17:52:02 crc kubenswrapper[4772]: I0127 17:52:02.188699 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-547cbdb99f-l8d48_c09741c3-6bae-487a-9b4c-7c9f01d8c5bf/manager/0.log" Jan 27 17:52:02 crc kubenswrapper[4772]: I0127 17:52:02.313771 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-799bc87c89-k2l8k_e4a99865-64a7-49e5-bdce-ff929105fc0d/manager/0.log" Jan 27 17:52:02 crc kubenswrapper[4772]: I0127 17:52:02.461743 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-69797bbcbd-ln7xf_6242683c-24ad-4e22-a7b3-8463e07388c2/manager/0.log" Jan 27 17:52:02 crc kubenswrapper[4772]: I0127 17:52:02.572217 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c9bb4b66c-ws2mh_783285f4-2e9d-4af5-b017-32676e7d1b01/manager/0.log" Jan 27 17:52:02 crc kubenswrapper[4772]: I0127 17:52:02.758904 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-ff554fc88-clt4p_8087d8d3-d2f6-4bca-abec-f5b5335f26fa/manager/0.log" Jan 27 17:52:05 crc kubenswrapper[4772]: I0127 17:52:05.663877 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:52:05 crc kubenswrapper[4772]: E0127 17:52:05.665514 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:52:19 crc kubenswrapper[4772]: I0127 17:52:19.682272 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:52:19 crc kubenswrapper[4772]: E0127 17:52:19.683158 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:52:23 crc kubenswrapper[4772]: I0127 17:52:23.938504 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5vmlj_db4a3858-5afa-44c8-a435-2010f7e7340d/control-plane-machine-set-operator/0.log" Jan 27 17:52:24 crc kubenswrapper[4772]: I0127 17:52:24.150791 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mfh29_625f7e2d-0e3f-4c2c-8f49-b09fc3638536/kube-rbac-proxy/0.log" Jan 27 17:52:24 crc kubenswrapper[4772]: I0127 17:52:24.214265 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mfh29_625f7e2d-0e3f-4c2c-8f49-b09fc3638536/machine-api-operator/0.log" Jan 27 17:52:34 crc kubenswrapper[4772]: I0127 17:52:34.663476 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:52:34 crc kubenswrapper[4772]: E0127 17:52:34.664534 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:52:38 crc kubenswrapper[4772]: I0127 17:52:38.670371 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-bmz8p_7b51171e-5d65-415e-8052-3cc8991f5de4/cert-manager-controller/0.log" Jan 27 17:52:38 crc kubenswrapper[4772]: I0127 17:52:38.741941 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-s7ksj_8d401dfc-33d3-416f-abba-cad4a1e173bd/cert-manager-cainjector/0.log" Jan 27 17:52:38 crc kubenswrapper[4772]: I0127 17:52:38.849453 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-4886f_ab67a3dd-5a79-400f-9b27-294ef256823d/cert-manager-webhook/0.log" Jan 27 17:52:48 crc kubenswrapper[4772]: I0127 17:52:48.663251 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:52:48 crc kubenswrapper[4772]: E0127 17:52:48.663878 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:52:52 crc kubenswrapper[4772]: I0127 17:52:52.727064 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-jlq99_3706a5f9-4370-4cca-abb9-b23e8b9c828f/nmstate-console-plugin/0.log" Jan 27 17:52:52 crc kubenswrapper[4772]: I0127 17:52:52.900436 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mmdc5_d834ccf6-9b3a-4a3e-8980-7f0a102babd0/nmstate-handler/0.log" Jan 27 17:52:52 crc kubenswrapper[4772]: I0127 17:52:52.963964 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-g7d66_004d59b7-1d3b-41af-8c3d-c6562dd9716a/kube-rbac-proxy/0.log" Jan 27 17:52:52 crc kubenswrapper[4772]: I0127 17:52:52.975233 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-g7d66_004d59b7-1d3b-41af-8c3d-c6562dd9716a/nmstate-metrics/0.log" Jan 27 17:52:53 crc kubenswrapper[4772]: I0127 17:52:53.186195 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-gpfqc_cf21b49c-f01b-4c7c-bdb9-57e115b364d9/nmstate-webhook/0.log" Jan 27 17:52:53 crc kubenswrapper[4772]: I0127 17:52:53.208610 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mdjph_2914eab0-19c8-464b-a774-d30a492f6763/nmstate-operator/0.log" Jan 27 17:52:59 crc kubenswrapper[4772]: I0127 17:52:59.663087 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:52:59 crc kubenswrapper[4772]: E0127 17:52:59.663831 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" 
Jan 27 17:53:13 crc kubenswrapper[4772]: I0127 17:53:13.663516 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:53:13 crc kubenswrapper[4772]: E0127 17:53:13.664465 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.032327 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vdg69_d2282b46-452e-402e-b929-23875b572727/kube-rbac-proxy/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.211790 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-frr-files/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.298642 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-vdg69_d2282b46-452e-402e-b929-23875b572727/controller/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.400020 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-reloader/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.426023 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-frr-files/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.475542 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-metrics/0.log" Jan 27 17:53:24 crc 
kubenswrapper[4772]: I0127 17:53:24.485610 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-reloader/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.623184 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-frr-files/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.659031 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-reloader/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.662828 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-metrics/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.662838 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-metrics/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.878454 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-frr-files/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.889741 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/controller/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.915290 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-metrics/0.log" Jan 27 17:53:24 crc kubenswrapper[4772]: I0127 17:53:24.921726 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/cp-reloader/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.064714 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/kube-rbac-proxy/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.065464 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/frr-metrics/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.087536 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/kube-rbac-proxy-frr/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.229869 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/reloader/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.318643 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-qpxrs_28ed9da3-cd29-4d80-9703-472bdbb3c64b/frr-k8s-webhook-server/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.453857 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5c6dd9c74b-84qz4_f72c611d-60d8-4649-a410-38434d01d8e2/manager/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.627745 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66986f9f9f-bmvms_c5ee8d7f-0160-4526-8ae0-45a50a450725/webhook-server/0.log" Jan 27 17:53:25 crc kubenswrapper[4772]: I0127 17:53:25.719818 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cl54q_24577bed-b34e-4419-9e9e-7068155ba0d1/kube-rbac-proxy/0.log" Jan 27 17:53:26 crc kubenswrapper[4772]: I0127 17:53:26.610611 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-cl54q_24577bed-b34e-4419-9e9e-7068155ba0d1/speaker/0.log" Jan 27 17:53:27 crc kubenswrapper[4772]: I0127 17:53:27.664094 4772 
scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:53:27 crc kubenswrapper[4772]: E0127 17:53:27.664818 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:53:27 crc kubenswrapper[4772]: I0127 17:53:27.829407 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jhpnb_5d41fcac-7044-4f36-b9f8-0b656bb3bcca/frr/0.log" Jan 27 17:53:41 crc kubenswrapper[4772]: I0127 17:53:41.339769 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/util/0.log" Jan 27 17:53:41 crc kubenswrapper[4772]: I0127 17:53:41.485186 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/util/0.log" Jan 27 17:53:41 crc kubenswrapper[4772]: I0127 17:53:41.521899 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/pull/0.log" Jan 27 17:53:41 crc kubenswrapper[4772]: I0127 17:53:41.563690 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/pull/0.log" Jan 27 17:53:41 crc kubenswrapper[4772]: I0127 17:53:41.662957 4772 scope.go:117] "RemoveContainer" 
containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:53:41 crc kubenswrapper[4772]: E0127 17:53:41.663316 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.195078 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/util/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.215365 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/pull/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.242097 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a4lwpg_44b239be-466d-4995-9c33-38d68a00550d/extract/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.402066 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/util/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.594131 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/pull/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.614982 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/util/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.651215 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/pull/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.782669 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/util/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.814282 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/pull/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.843326 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcbzzsk_09090577-fdfa-4f36-badf-f32c6ee2ab7d/extract/0.log" Jan 27 17:53:42 crc kubenswrapper[4772]: I0127 17:53:42.991222 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/util/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.163728 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/pull/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.180718 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/util/0.log" Jan 27 
17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.214851 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/pull/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.413117 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/extract/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.432474 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/util/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.442908 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713vlmsf_d583171b-99cd-49da-9a9f-48931806cb45/pull/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.599989 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-utilities/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.766688 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-utilities/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.790340 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-content/0.log" Jan 27 17:53:43 crc kubenswrapper[4772]: I0127 17:53:43.802319 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-content/0.log" Jan 27 
17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.017056 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-content/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.056780 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/extract-utilities/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.264778 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-utilities/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.430612 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-utilities/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.478597 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-content/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.482869 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-content/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.861159 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-utilities/0.log" Jan 27 17:53:44 crc kubenswrapper[4772]: I0127 17:53:44.905426 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/extract-content/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.085944 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/2.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.155591 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2glnd_d8591d45-25d0-47ea-a856-9cd5334e4a8c/marketplace-operator/3.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.322067 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-utilities/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.436231 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-dfvcs_881b071c-048c-4f66-96e7-fd1f91ca23f8/registry-server/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.539903 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kb62q_e840c358-8d41-4381-b643-3bd35f0716a2/registry-server/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.565829 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-utilities/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.578324 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-content/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.585359 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-content/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.746575 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-content/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.788595 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/extract-utilities/0.log" Jan 27 17:53:45 crc kubenswrapper[4772]: I0127 17:53:45.839570 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-utilities/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.091421 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f5shj_72dcc284-e96b-4605-a428-176ca549eeb2/registry-server/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.097162 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-utilities/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.101760 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-content/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.130801 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-content/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.258270 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-utilities/0.log" Jan 27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.332344 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/extract-content/0.log" Jan 
27 17:53:46 crc kubenswrapper[4772]: I0127 17:53:46.511602 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-sdbgg_09211095-3894-4db1-bcea-29d1c2064979/registry-server/0.log" Jan 27 17:53:54 crc kubenswrapper[4772]: I0127 17:53:54.670491 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:53:54 crc kubenswrapper[4772]: E0127 17:53:54.671450 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:54:05 crc kubenswrapper[4772]: I0127 17:54:05.662588 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:54:05 crc kubenswrapper[4772]: E0127 17:54:05.663229 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.815387 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:08 crc kubenswrapper[4772]: E0127 17:54:08.816619 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="extract-content" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 
17:54:08.816633 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="extract-content" Jan 27 17:54:08 crc kubenswrapper[4772]: E0127 17:54:08.816652 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.816658 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" Jan 27 17:54:08 crc kubenswrapper[4772]: E0127 17:54:08.816680 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="extract-utilities" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.816687 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="extract-utilities" Jan 27 17:54:08 crc kubenswrapper[4772]: E0127 17:54:08.816715 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" containerName="container-00" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.816721 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" containerName="container-00" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.817015 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ac0d9e-ba08-48c0-9a6c-307c3438b86e" containerName="container-00" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.817039 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bec6411-aae2-47f7-a721-a39d6d381c17" containerName="registry-server" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.826341 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.858125 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.938529 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.938602 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:08 crc kubenswrapper[4772]: I0127 17:54:08.938643 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.040336 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.040407 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.040446 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.040957 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.041031 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.062830 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp\") pod \"community-operators-vnhgt\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.153849 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:09 crc kubenswrapper[4772]: I0127 17:54:09.973852 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:10 crc kubenswrapper[4772]: I0127 17:54:10.370296 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerStarted","Data":"199265b153397fccb7e5a85666b63dd6b5b8fed3e925fc3bb5d59089f4acf311"} Jan 27 17:54:11 crc kubenswrapper[4772]: I0127 17:54:11.380500 4772 generic.go:334] "Generic (PLEG): container finished" podID="41b8d2e5-185e-4766-82dc-475963959026" containerID="f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005" exitCode=0 Jan 27 17:54:11 crc kubenswrapper[4772]: I0127 17:54:11.380779 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerDied","Data":"f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005"} Jan 27 17:54:11 crc kubenswrapper[4772]: I0127 17:54:11.382935 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 17:54:13 crc kubenswrapper[4772]: I0127 17:54:13.402139 4772 generic.go:334] "Generic (PLEG): container finished" podID="41b8d2e5-185e-4766-82dc-475963959026" containerID="e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a" exitCode=0 Jan 27 17:54:13 crc kubenswrapper[4772]: I0127 17:54:13.402253 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerDied","Data":"e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a"} Jan 27 17:54:14 crc kubenswrapper[4772]: I0127 17:54:14.413487 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerStarted","Data":"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115"} Jan 27 17:54:14 crc kubenswrapper[4772]: I0127 17:54:14.432452 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnhgt" podStartSLOduration=3.985287484 podStartE2EDuration="6.432427148s" podCreationTimestamp="2026-01-27 17:54:08 +0000 UTC" firstStartedPulling="2026-01-27 17:54:11.38267879 +0000 UTC m=+10037.363287898" lastFinishedPulling="2026-01-27 17:54:13.829818464 +0000 UTC m=+10039.810427562" observedRunningTime="2026-01-27 17:54:14.42830048 +0000 UTC m=+10040.408909578" watchObservedRunningTime="2026-01-27 17:54:14.432427148 +0000 UTC m=+10040.413036246" Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.154985 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.156620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.233329 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.509566 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.564098 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:19 crc kubenswrapper[4772]: I0127 17:54:19.662890 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:54:19 crc 
kubenswrapper[4772]: E0127 17:54:19.663156 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:54:21 crc kubenswrapper[4772]: I0127 17:54:21.492624 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vnhgt" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="registry-server" containerID="cri-o://aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115" gracePeriod=2 Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.051838 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.108675 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp\") pod \"41b8d2e5-185e-4766-82dc-475963959026\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.108801 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content\") pod \"41b8d2e5-185e-4766-82dc-475963959026\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.108820 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities\") pod \"41b8d2e5-185e-4766-82dc-475963959026\" (UID: \"41b8d2e5-185e-4766-82dc-475963959026\") " Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.109691 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities" (OuterVolumeSpecName: "utilities") pod "41b8d2e5-185e-4766-82dc-475963959026" (UID: "41b8d2e5-185e-4766-82dc-475963959026"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.116533 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp" (OuterVolumeSpecName: "kube-api-access-226hp") pod "41b8d2e5-185e-4766-82dc-475963959026" (UID: "41b8d2e5-185e-4766-82dc-475963959026"). InnerVolumeSpecName "kube-api-access-226hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.181160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41b8d2e5-185e-4766-82dc-475963959026" (UID: "41b8d2e5-185e-4766-82dc-475963959026"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.210408 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.210449 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41b8d2e5-185e-4766-82dc-475963959026-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.210459 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-226hp\" (UniqueName: \"kubernetes.io/projected/41b8d2e5-185e-4766-82dc-475963959026-kube-api-access-226hp\") on node \"crc\" DevicePath \"\"" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.502750 4772 generic.go:334] "Generic (PLEG): container finished" podID="41b8d2e5-185e-4766-82dc-475963959026" containerID="aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115" exitCode=0 Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.502794 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerDied","Data":"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115"} Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.502821 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnhgt" event={"ID":"41b8d2e5-185e-4766-82dc-475963959026","Type":"ContainerDied","Data":"199265b153397fccb7e5a85666b63dd6b5b8fed3e925fc3bb5d59089f4acf311"} Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.502837 4772 scope.go:117] "RemoveContainer" containerID="aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 
17:54:22.502839 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnhgt" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.522019 4772 scope.go:117] "RemoveContainer" containerID="e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.547401 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.557936 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vnhgt"] Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.560127 4772 scope.go:117] "RemoveContainer" containerID="f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.594460 4772 scope.go:117] "RemoveContainer" containerID="aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115" Jan 27 17:54:22 crc kubenswrapper[4772]: E0127 17:54:22.595892 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115\": container with ID starting with aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115 not found: ID does not exist" containerID="aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.595936 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115"} err="failed to get container status \"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115\": rpc error: code = NotFound desc = could not find container \"aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115\": container with ID starting with 
aac79acc3e134a8a805aefd2c496fbc70975b6548e84461983c168b510e44115 not found: ID does not exist" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.595964 4772 scope.go:117] "RemoveContainer" containerID="e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a" Jan 27 17:54:22 crc kubenswrapper[4772]: E0127 17:54:22.596294 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a\": container with ID starting with e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a not found: ID does not exist" containerID="e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.596331 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a"} err="failed to get container status \"e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a\": rpc error: code = NotFound desc = could not find container \"e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a\": container with ID starting with e380eca5b91487ee3d86108869a3d28c49ca1c5d2415c5ccd5dcff2e75cd190a not found: ID does not exist" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.596371 4772 scope.go:117] "RemoveContainer" containerID="f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005" Jan 27 17:54:22 crc kubenswrapper[4772]: E0127 17:54:22.596607 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005\": container with ID starting with f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005 not found: ID does not exist" containerID="f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005" Jan 27 17:54:22 crc 
kubenswrapper[4772]: I0127 17:54:22.596825 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005"} err="failed to get container status \"f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005\": rpc error: code = NotFound desc = could not find container \"f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005\": container with ID starting with f41b1e666393e5a2661e4c583dc4dc3e931e50487a86a7a14b7e7e6d2d45c005 not found: ID does not exist" Jan 27 17:54:22 crc kubenswrapper[4772]: I0127 17:54:22.676663 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b8d2e5-185e-4766-82dc-475963959026" path="/var/lib/kubelet/pods/41b8d2e5-185e-4766-82dc-475963959026/volumes" Jan 27 17:54:30 crc kubenswrapper[4772]: I0127 17:54:30.663922 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:54:30 crc kubenswrapper[4772]: E0127 17:54:30.665032 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:54:45 crc kubenswrapper[4772]: I0127 17:54:45.667890 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:54:45 crc kubenswrapper[4772]: E0127 17:54:45.669743 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:54:58 crc kubenswrapper[4772]: I0127 17:54:58.663944 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:54:58 crc kubenswrapper[4772]: E0127 17:54:58.664959 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:55:10 crc kubenswrapper[4772]: I0127 17:55:10.663996 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:55:10 crc kubenswrapper[4772]: E0127 17:55:10.664955 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.035802 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:13 crc kubenswrapper[4772]: E0127 17:55:13.036723 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="registry-server" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 
17:55:13.036739 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="registry-server" Jan 27 17:55:13 crc kubenswrapper[4772]: E0127 17:55:13.036769 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="extract-content" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.036775 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="extract-content" Jan 27 17:55:13 crc kubenswrapper[4772]: E0127 17:55:13.036791 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="extract-utilities" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.036798 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="extract-utilities" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.037027 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b8d2e5-185e-4766-82dc-475963959026" containerName="registry-server" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.039728 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.047645 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.197659 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.197740 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.197763 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.299716 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.299790 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.299819 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.300420 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.300421 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.333433 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8\") pod \"redhat-marketplace-5kr2z\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.414154 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:13 crc kubenswrapper[4772]: I0127 17:55:13.938416 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:15 crc kubenswrapper[4772]: I0127 17:55:15.012296 4772 generic.go:334] "Generic (PLEG): container finished" podID="84e9f92a-7805-4587-9826-afda667e7ee1" containerID="24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c" exitCode=0 Jan 27 17:55:15 crc kubenswrapper[4772]: I0127 17:55:15.012354 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerDied","Data":"24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c"} Jan 27 17:55:15 crc kubenswrapper[4772]: I0127 17:55:15.012661 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerStarted","Data":"37b015df7b3b2fcbad3ebd639dc3d4bb0302697c171f372c59ec684cae6a5d2f"} Jan 27 17:55:16 crc kubenswrapper[4772]: I0127 17:55:16.024401 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerStarted","Data":"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb"} Jan 27 17:55:17 crc kubenswrapper[4772]: I0127 17:55:17.037232 4772 generic.go:334] "Generic (PLEG): container finished" podID="84e9f92a-7805-4587-9826-afda667e7ee1" containerID="d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb" exitCode=0 Jan 27 17:55:17 crc kubenswrapper[4772]: I0127 17:55:17.037641 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" 
event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerDied","Data":"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb"} Jan 27 17:55:18 crc kubenswrapper[4772]: I0127 17:55:18.049507 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerStarted","Data":"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa"} Jan 27 17:55:18 crc kubenswrapper[4772]: I0127 17:55:18.080295 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5kr2z" podStartSLOduration=2.511108062 podStartE2EDuration="5.080275438s" podCreationTimestamp="2026-01-27 17:55:13 +0000 UTC" firstStartedPulling="2026-01-27 17:55:15.014573336 +0000 UTC m=+10100.995182434" lastFinishedPulling="2026-01-27 17:55:17.583740712 +0000 UTC m=+10103.564349810" observedRunningTime="2026-01-27 17:55:18.073253458 +0000 UTC m=+10104.053862566" watchObservedRunningTime="2026-01-27 17:55:18.080275438 +0000 UTC m=+10104.060884546" Jan 27 17:55:21 crc kubenswrapper[4772]: I0127 17:55:21.663489 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:55:21 crc kubenswrapper[4772]: E0127 17:55:21.664137 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:55:23 crc kubenswrapper[4772]: I0127 17:55:23.414942 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:23 crc 
kubenswrapper[4772]: I0127 17:55:23.415262 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:23 crc kubenswrapper[4772]: I0127 17:55:23.467390 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:24 crc kubenswrapper[4772]: I0127 17:55:24.413124 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:25 crc kubenswrapper[4772]: I0127 17:55:25.019737 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.127995 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5kr2z" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="registry-server" containerID="cri-o://d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa" gracePeriod=2 Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.662227 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.774155 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities\") pod \"84e9f92a-7805-4587-9826-afda667e7ee1\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.774297 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content\") pod \"84e9f92a-7805-4587-9826-afda667e7ee1\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.774479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8\") pod \"84e9f92a-7805-4587-9826-afda667e7ee1\" (UID: \"84e9f92a-7805-4587-9826-afda667e7ee1\") " Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.785390 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8" (OuterVolumeSpecName: "kube-api-access-7tdd8") pod "84e9f92a-7805-4587-9826-afda667e7ee1" (UID: "84e9f92a-7805-4587-9826-afda667e7ee1"). InnerVolumeSpecName "kube-api-access-7tdd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.788488 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities" (OuterVolumeSpecName: "utilities") pod "84e9f92a-7805-4587-9826-afda667e7ee1" (UID: "84e9f92a-7805-4587-9826-afda667e7ee1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.877525 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.877743 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tdd8\" (UniqueName: \"kubernetes.io/projected/84e9f92a-7805-4587-9826-afda667e7ee1-kube-api-access-7tdd8\") on node \"crc\" DevicePath \"\"" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.908837 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84e9f92a-7805-4587-9826-afda667e7ee1" (UID: "84e9f92a-7805-4587-9826-afda667e7ee1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 17:55:26 crc kubenswrapper[4772]: I0127 17:55:26.980419 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e9f92a-7805-4587-9826-afda667e7ee1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.139479 4772 generic.go:334] "Generic (PLEG): container finished" podID="84e9f92a-7805-4587-9826-afda667e7ee1" containerID="d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa" exitCode=0 Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.139538 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerDied","Data":"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa"} Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.139577 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5kr2z" event={"ID":"84e9f92a-7805-4587-9826-afda667e7ee1","Type":"ContainerDied","Data":"37b015df7b3b2fcbad3ebd639dc3d4bb0302697c171f372c59ec684cae6a5d2f"} Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.139606 4772 scope.go:117] "RemoveContainer" containerID="d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.139779 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kr2z" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.184091 4772 scope.go:117] "RemoveContainer" containerID="d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.195685 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.210216 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kr2z"] Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.217397 4772 scope.go:117] "RemoveContainer" containerID="24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.254232 4772 scope.go:117] "RemoveContainer" containerID="d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa" Jan 27 17:55:27 crc kubenswrapper[4772]: E0127 17:55:27.254961 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa\": container with ID starting with d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa not found: ID does not exist" containerID="d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.255003 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa"} err="failed to get container status \"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa\": rpc error: code = NotFound desc = could not find container \"d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa\": container with ID starting with d7125b1b6edd7114a279766eaf0708862a3658f1e482bfc3610c0145352fb0fa not found: ID does not exist" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.255030 4772 scope.go:117] "RemoveContainer" containerID="d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb" Jan 27 17:55:27 crc kubenswrapper[4772]: E0127 17:55:27.255423 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb\": container with ID starting with d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb not found: ID does not exist" containerID="d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.255465 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb"} err="failed to get container status \"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb\": rpc error: code = NotFound desc = could not find container \"d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb\": container with ID starting with d4b51a7cc091f99cda607405d2e8f9684e13bdb11c929a5b8c5b1ac513e7eabb not found: ID does not exist" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.255560 4772 scope.go:117] "RemoveContainer" containerID="24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c" Jan 27 17:55:27 crc kubenswrapper[4772]: E0127 
17:55:27.255946 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c\": container with ID starting with 24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c not found: ID does not exist" containerID="24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c" Jan 27 17:55:27 crc kubenswrapper[4772]: I0127 17:55:27.255989 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c"} err="failed to get container status \"24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c\": rpc error: code = NotFound desc = could not find container \"24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c\": container with ID starting with 24934acfd4ff03828952a4d233d786064c58ec45adfd8f4996766292937bef9c not found: ID does not exist" Jan 27 17:55:28 crc kubenswrapper[4772]: I0127 17:55:28.683243 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" path="/var/lib/kubelet/pods/84e9f92a-7805-4587-9826-afda667e7ee1/volumes" Jan 27 17:55:32 crc kubenswrapper[4772]: I0127 17:55:32.202690 4772 generic.go:334] "Generic (PLEG): container finished" podID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerID="daea92a47f41c2b8eac7ad3a9eb9829829dd28a915856ca1714eacc250cb2602" exitCode=0 Jan 27 17:55:32 crc kubenswrapper[4772]: I0127 17:55:32.202827 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" event={"ID":"62afdf43-7cc5-4d53-aff8-2fd18fbfd493","Type":"ContainerDied","Data":"daea92a47f41c2b8eac7ad3a9eb9829829dd28a915856ca1714eacc250cb2602"} Jan 27 17:55:32 crc kubenswrapper[4772]: I0127 17:55:32.204365 4772 scope.go:117] "RemoveContainer" 
containerID="daea92a47f41c2b8eac7ad3a9eb9829829dd28a915856ca1714eacc250cb2602" Jan 27 17:55:32 crc kubenswrapper[4772]: I0127 17:55:32.489368 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7dvs7_must-gather-vr6mx_62afdf43-7cc5-4d53-aff8-2fd18fbfd493/gather/0.log" Jan 27 17:55:36 crc kubenswrapper[4772]: I0127 17:55:36.664076 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025" Jan 27 17:55:36 crc kubenswrapper[4772]: E0127 17:55:36.665626 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4hwxn_openshift-machine-config-operator(67794a44-d793-4fd7-9e54-e40437f67c0b)\"" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.035519 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7dvs7/must-gather-vr6mx"] Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.036038 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="copy" containerID="cri-o://7d2e1870707eb567a3ae7d6fc16a9352a0ad811fd3af2c67ad9db194e0056b53" gracePeriod=2 Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.043516 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7dvs7/must-gather-vr6mx"] Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.290945 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7dvs7_must-gather-vr6mx_62afdf43-7cc5-4d53-aff8-2fd18fbfd493/copy/0.log" Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.291745 4772 generic.go:334] "Generic (PLEG): 
container finished" podID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerID="7d2e1870707eb567a3ae7d6fc16a9352a0ad811fd3af2c67ad9db194e0056b53" exitCode=143 Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.467800 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7dvs7_must-gather-vr6mx_62afdf43-7cc5-4d53-aff8-2fd18fbfd493/copy/0.log" Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.468286 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7dvs7/must-gather-vr6mx" Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.586039 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output\") pod \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.586300 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwmf\" (UniqueName: \"kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf\") pod \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\" (UID: \"62afdf43-7cc5-4d53-aff8-2fd18fbfd493\") " Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.600469 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf" (OuterVolumeSpecName: "kube-api-access-vwwmf") pod "62afdf43-7cc5-4d53-aff8-2fd18fbfd493" (UID: "62afdf43-7cc5-4d53-aff8-2fd18fbfd493"). InnerVolumeSpecName "kube-api-access-vwwmf". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.691665 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwwmf\" (UniqueName: \"kubernetes.io/projected/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-kube-api-access-vwwmf\") on node \"crc\" DevicePath \"\""
Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.754134 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "62afdf43-7cc5-4d53-aff8-2fd18fbfd493" (UID: "62afdf43-7cc5-4d53-aff8-2fd18fbfd493"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:55:40 crc kubenswrapper[4772]: I0127 17:55:40.795331 4772 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62afdf43-7cc5-4d53-aff8-2fd18fbfd493-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 27 17:55:41 crc kubenswrapper[4772]: I0127 17:55:41.313068 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7dvs7_must-gather-vr6mx_62afdf43-7cc5-4d53-aff8-2fd18fbfd493/copy/0.log"
Jan 27 17:55:41 crc kubenswrapper[4772]: I0127 17:55:41.313803 4772 scope.go:117] "RemoveContainer" containerID="7d2e1870707eb567a3ae7d6fc16a9352a0ad811fd3af2c67ad9db194e0056b53"
Jan 27 17:55:41 crc kubenswrapper[4772]: I0127 17:55:41.313982 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7dvs7/must-gather-vr6mx"
Jan 27 17:55:41 crc kubenswrapper[4772]: I0127 17:55:41.339797 4772 scope.go:117] "RemoveContainer" containerID="daea92a47f41c2b8eac7ad3a9eb9829829dd28a915856ca1714eacc250cb2602"
Jan 27 17:55:42 crc kubenswrapper[4772]: I0127 17:55:42.683069 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" path="/var/lib/kubelet/pods/62afdf43-7cc5-4d53-aff8-2fd18fbfd493/volumes"
Jan 27 17:55:51 crc kubenswrapper[4772]: I0127 17:55:51.663196 4772 scope.go:117] "RemoveContainer" containerID="b406a942c3fb69b474f2dc48f4fd84a565681c5c0e15f723d9fd971770b5e025"
Jan 27 17:55:52 crc kubenswrapper[4772]: I0127 17:55:52.422952 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" event={"ID":"67794a44-d793-4fd7-9e54-e40437f67c0b","Type":"ContainerStarted","Data":"27602b2436534c6f3f068e3bb084a37c973a59402b8f43ca60850070bae7c8e1"}
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.791972 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:35 crc kubenswrapper[4772]: E0127 17:57:35.793257 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="gather"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793283 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="gather"
Jan 27 17:57:35 crc kubenswrapper[4772]: E0127 17:57:35.793317 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="extract-utilities"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793330 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="extract-utilities"
Jan 27 17:57:35 crc kubenswrapper[4772]: E0127 17:57:35.793367 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="extract-content"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793380 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="extract-content"
Jan 27 17:57:35 crc kubenswrapper[4772]: E0127 17:57:35.793410 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="registry-server"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793422 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="registry-server"
Jan 27 17:57:35 crc kubenswrapper[4772]: E0127 17:57:35.793451 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="copy"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793463 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="copy"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793774 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e9f92a-7805-4587-9826-afda667e7ee1" containerName="registry-server"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793811 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="copy"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.793849 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="62afdf43-7cc5-4d53-aff8-2fd18fbfd493" containerName="gather"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.796439 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.824540 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.965913 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27mzs\" (UniqueName: \"kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.965994 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:35 crc kubenswrapper[4772]: I0127 17:57:35.966012 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.067246 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27mzs\" (UniqueName: \"kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.067332 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.067360 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.067919 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.068470 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.098062 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27mzs\" (UniqueName: \"kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs\") pod \"certified-operators-hqpxg\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") " pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.166736 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:36 crc kubenswrapper[4772]: I0127 17:57:36.771506 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:37 crc kubenswrapper[4772]: I0127 17:57:37.518405 4772 generic.go:334] "Generic (PLEG): container finished" podID="a29fc4ba-71f0-482a-8d16-888aff51e921" containerID="3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea" exitCode=0
Jan 27 17:57:37 crc kubenswrapper[4772]: I0127 17:57:37.518827 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerDied","Data":"3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea"}
Jan 27 17:57:37 crc kubenswrapper[4772]: I0127 17:57:37.518861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerStarted","Data":"d0d879401429a76bebba81c1b4334ca4f4e0c68e48881234b455c1831c80224e"}
Jan 27 17:57:38 crc kubenswrapper[4772]: I0127 17:57:38.528724 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerStarted","Data":"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"}
Jan 27 17:57:39 crc kubenswrapper[4772]: I0127 17:57:39.553668 4772 generic.go:334] "Generic (PLEG): container finished" podID="a29fc4ba-71f0-482a-8d16-888aff51e921" containerID="1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f" exitCode=0
Jan 27 17:57:39 crc kubenswrapper[4772]: I0127 17:57:39.554151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerDied","Data":"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"}
Jan 27 17:57:40 crc kubenswrapper[4772]: I0127 17:57:40.567071 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerStarted","Data":"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"}
Jan 27 17:57:40 crc kubenswrapper[4772]: I0127 17:57:40.589302 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqpxg" podStartSLOduration=3.109391427 podStartE2EDuration="5.589281316s" podCreationTimestamp="2026-01-27 17:57:35 +0000 UTC" firstStartedPulling="2026-01-27 17:57:37.520957007 +0000 UTC m=+10243.501566105" lastFinishedPulling="2026-01-27 17:57:40.000846856 +0000 UTC m=+10245.981455994" observedRunningTime="2026-01-27 17:57:40.587553086 +0000 UTC m=+10246.568162234" watchObservedRunningTime="2026-01-27 17:57:40.589281316 +0000 UTC m=+10246.569890414"
Jan 27 17:57:46 crc kubenswrapper[4772]: I0127 17:57:46.167524 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:46 crc kubenswrapper[4772]: I0127 17:57:46.168344 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:46 crc kubenswrapper[4772]: I0127 17:57:46.222991 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:46 crc kubenswrapper[4772]: I0127 17:57:46.693931 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:46 crc kubenswrapper[4772]: I0127 17:57:46.747996 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:48 crc kubenswrapper[4772]: I0127 17:57:48.646722 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqpxg" podUID="a29fc4ba-71f0-482a-8d16-888aff51e921" containerName="registry-server" containerID="cri-o://74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7" gracePeriod=2
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.147869 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.255589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content\") pod \"a29fc4ba-71f0-482a-8d16-888aff51e921\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") "
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.255805 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27mzs\" (UniqueName: \"kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs\") pod \"a29fc4ba-71f0-482a-8d16-888aff51e921\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") "
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.255971 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities\") pod \"a29fc4ba-71f0-482a-8d16-888aff51e921\" (UID: \"a29fc4ba-71f0-482a-8d16-888aff51e921\") "
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.257466 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities" (OuterVolumeSpecName: "utilities") pod "a29fc4ba-71f0-482a-8d16-888aff51e921" (UID: "a29fc4ba-71f0-482a-8d16-888aff51e921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.258439 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.261872 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs" (OuterVolumeSpecName: "kube-api-access-27mzs") pod "a29fc4ba-71f0-482a-8d16-888aff51e921" (UID: "a29fc4ba-71f0-482a-8d16-888aff51e921"). InnerVolumeSpecName "kube-api-access-27mzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.311282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a29fc4ba-71f0-482a-8d16-888aff51e921" (UID: "a29fc4ba-71f0-482a-8d16-888aff51e921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.360700 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a29fc4ba-71f0-482a-8d16-888aff51e921-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.360733 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27mzs\" (UniqueName: \"kubernetes.io/projected/a29fc4ba-71f0-482a-8d16-888aff51e921-kube-api-access-27mzs\") on node \"crc\" DevicePath \"\""
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.666804 4772 generic.go:334] "Generic (PLEG): container finished" podID="a29fc4ba-71f0-482a-8d16-888aff51e921" containerID="74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7" exitCode=0
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.666840 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerDied","Data":"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"}
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.666856 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqpxg"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.666871 4772 scope.go:117] "RemoveContainer" containerID="74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.666861 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqpxg" event={"ID":"a29fc4ba-71f0-482a-8d16-888aff51e921","Type":"ContainerDied","Data":"d0d879401429a76bebba81c1b4334ca4f4e0c68e48881234b455c1831c80224e"}
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.696488 4772 scope.go:117] "RemoveContainer" containerID="1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.705940 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.715810 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqpxg"]
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.726600 4772 scope.go:117] "RemoveContainer" containerID="3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.786673 4772 scope.go:117] "RemoveContainer" containerID="74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"
Jan 27 17:57:49 crc kubenswrapper[4772]: E0127 17:57:49.786972 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7\": container with ID starting with 74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7 not found: ID does not exist" containerID="74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.787012 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7"} err="failed to get container status \"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7\": rpc error: code = NotFound desc = could not find container \"74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7\": container with ID starting with 74a5bdd2c327de4c131d638a5bfc55206bf72f2cb957ad4b1edb4a47508787d7 not found: ID does not exist"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.787033 4772 scope.go:117] "RemoveContainer" containerID="1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"
Jan 27 17:57:49 crc kubenswrapper[4772]: E0127 17:57:49.787217 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f\": container with ID starting with 1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f not found: ID does not exist" containerID="1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.787242 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f"} err="failed to get container status \"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f\": rpc error: code = NotFound desc = could not find container \"1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f\": container with ID starting with 1dc2f6290a8ba6dcfac8a909eaee7377853a16025b23d62b1fd7d2d2249b6d7f not found: ID does not exist"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.787257 4772 scope.go:117] "RemoveContainer" containerID="3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea"
Jan 27 17:57:49 crc kubenswrapper[4772]: E0127 17:57:49.787435 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea\": container with ID starting with 3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea not found: ID does not exist" containerID="3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea"
Jan 27 17:57:49 crc kubenswrapper[4772]: I0127 17:57:49.787455 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea"} err="failed to get container status \"3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea\": rpc error: code = NotFound desc = could not find container \"3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea\": container with ID starting with 3cf46495e6c295b8311bd4db838af9d233d54cd2e679f422d69d5dc99e36eaea not found: ID does not exist"
Jan 27 17:57:50 crc kubenswrapper[4772]: I0127 17:57:50.683578 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29fc4ba-71f0-482a-8d16-888aff51e921" path="/var/lib/kubelet/pods/a29fc4ba-71f0-482a-8d16-888aff51e921/volumes"
Jan 27 17:58:12 crc kubenswrapper[4772]: I0127 17:58:12.058833 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 17:58:12 crc kubenswrapper[4772]: I0127 17:58:12.059676 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 17:58:42 crc kubenswrapper[4772]: I0127 17:58:42.058657 4772 patch_prober.go:28] interesting pod/machine-config-daemon-4hwxn container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 17:58:42 crc kubenswrapper[4772]: I0127 17:58:42.075500 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4hwxn" podUID="67794a44-d793-4fd7-9e54-e40437f67c0b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"